Alternator Voltage Booster

This looks interesting, but I am an electronics moron, so for all I know it's somebody's phony, backyard snake oil.

Looking forward to reading smarter posts than mine!
 
It's a resistor placed between the regulator sense terminal and the battery, which "fools" the regulator into creating a higher voltage to compensate for the voltage drop that takes place when using silicon rectifiers in old-style isolators.
 
Actually it's a DIODE placed in the sense circuit, to create a voltage drop so the alternator increases its output by the amount of drop across the rectifier. (If you put a resistor in, there would be no drop, because a resistor only drops voltage when current flows through it, and the sense line carries almost none.)
You can do the same by replacing your 7.5 amp sense fuse with a 5 cent 1 amp diode. I made one up about 6 years ago and carry it with my spares in case I need it, as the volts get a bit low after running hot for a long time in the outback.
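The principle in rough numbers: the regulator drives the alternator until its sense pin reads the setpoint, so any drop you insert in the sense line gets added on top of the real output. A quick sketch (the 0.7 V figure is a typical silicon diode forward drop at low current; your diode's datasheet is the real authority):

# The regulator raises output until V(sense pin) == setpoint.
# A diode in the sense line drops ~0.7 V (typical silicon diode),
# so the battery side ends up higher by that amount.
SETPOINT = 13.8      # volts the regulator holds at its sense pin
DIODE_DROP = 0.7     # typical forward drop; check your diode's datasheet

battery_voltage = SETPOINT + DIODE_DROP
print(f"battery sees ~{battery_voltage:.1f} V")  # ~14.5 V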
 
I would think you run the risk of slowly boiling your batteries if the alternator is working properly, and of masking a problem if it isn't. I'm no expert, but I know DC-DC chargers do more than boost the voltage: they follow a voltage/current cycle that better suits the chemistry of the battery, putting more charge back in than an alternator alone can achieve.
 
100TD, what diode exactly are you using? I'm keen to try this out but am not sure what kind of diode to use. Do I need to get one with a forward voltage drop equal to the boost I want?
 

Is there any update on this? I am looking to boost my charging voltage by 0.5 volts or so.

My dual battery system (National Luna) usually fails to charge my second battery at idle (after warm-up) due to the lower alternator output: voltage drops below 12.7 volts and the battery isolator trips and isolates the batteries, especially if I'm running extra lights/accessories. Voltage goes up if I rev the engine slightly, but at pure idle it drops below the cut-off point of the dual battery controller.

What is the normal idle voltage on the 100 series, approx 14 volts? Hopefully the alternator is not on its way out. The truck's got 125,000 miles.
 

I have the exact same idle symptoms, but not 12.7 in Park; I will see 12.7 occasionally in Drive at a light. I was wondering if the alternator or voltage regulator was on its way out. Anybody else have any input on idle Park/Drive-at-a-light voltage?
 

If your volt meter is reading correctly, then I suspect you have an issue with your alternator, as the 100 series usually sits around 14.2 to 14.6. My 80 series TD puts out 14.6.

The generally accepted range of charging voltage out of an alternator is 13.8 to 14.6 volts. On some newer vehicles (2009 onwards), manufacturers are electing to use ECU-controlled alternators to aid fuel economy and emissions figures. Once the starting battery(s) reach sufficient charge to start the vehicle (usually around 80%), the alternator drops to a supply voltage of around 13.2 volts.

The minimum voltage to charge a wet cell lead acid battery effectively for optimum performance is 13.8; for calcium/calcium sealed maintenance-free lead acid it's around 14.2; and for flat-plate fully sealed AGM, gel or spiral-wound AGM the voltage needs to be 14.4.

So far, the vehicles I know for certain have these alternators are the Toyota LC200, all 70 series and the Prado 150 series. Toyota started using them around 2009/2010.

So using these diodes to trick the alternator is only needed if you want to use one of the newer-technology batteries that require 14.2 or 14.4 and your alternator is only putting out 13.8 or less. If you are achieving 14.6, count yourself lucky to have a perfect charge voltage. Over 14.6, you will cook your batteries.
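To summarise that logic as a quick rule of thumb (my figures from above; always confirm against your battery maker's spec sheet):

# Rough decision helper using the figures above (12 volt systems only).
MIN_CHARGE_VOLTS = {
    "flooded lead acid": 13.8,
    "calcium sealed maintenance-free": 14.2,
    "AGM / gel / spiral-wound": 14.4,
}
MAX_SAFE_VOLTS = 14.6  # above this you will cook the battery

def boost_needed(alternator_volts, chemistry):
    """Extra volts required, or 0.0 if the alternator already suffices."""
    required = MIN_CHARGE_VOLTS[chemistry]
    if alternator_volts >= required:
        return 0.0
    # Never boost past the safe ceiling.
    return min(required, MAX_SAFE_VOLTS) - alternator_volts

print(f"{boost_needed(13.8, 'AGM / gel / spiral-wound'):.1f} V")  # 0.6 V, about one diode's drop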

My only concern with these Alternator Voltage Booster diodes is that some of them require you to replace the 7.5 amp fuse with the diode, and the one I tried about 12 months ago did not blow even at 20 amps, so you lose the fuse protection.

Instead I elect to find the wire that runs between the fuse and the alternator, and cut in and solder a quick connector. This makes it possible to swap the diode (built into the quick connector) for a simple jumper wire, returning the vehicle to standard while, most importantly, retaining the factory 7.5 amp fuse.

Please remember these figures are for optimum performance and are only for 12 volt systems. :)
 
I don't have a dual battery system but was recently looking into Alternator/starter/battery issues and have some data points:

1) The 05 LX FSM says 13.2 to 14.8 V is the acceptable range for the alternator (and at least the 98 and 01 LX FSMs say 13.2 to 15.1 V is the acceptable range).
From 05 LX manual:
"Check the charging circuit as follows:
With the engine running from idling to 2,000 rpm, check
the reading on the ammeter and voltmeter.
Standard amperage:
10 A or less
Standard voltage:
13.2 to 14.8 V
"
2) Personally (05 with 110k mi), I'll get ~14.1 when the engine is idling until it's warmed up, and then more like ~13.6 most of the time. I assume it's due to the temperature rising leading to lower voltages rather than an ECU controlling it?
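If it is plain temperature compensation, the numbers would look something like this (the slope is a ballpark figure I've seen quoted for 12 V lead acid regulators, not a Toyota spec):

# Typical regulator temperature compensation: the setpoint falls as the
# battery/regulator warms. The -18 mV/degC slope is a generic ballpark,
# NOT a Toyota figure.
def setpoint(temp_c, v_at_25c=14.1, slope_mv_per_c=-18.0):
    return v_at_25c + (temp_c - 25.0) * slope_mv_per_c / 1000.0

print(f"{setpoint(25):.2f} V cold start")   # 14.10 V
print(f"{setpoint(55):.2f} V warmed up")    # ~13.56 V, close to the ~13.6 I see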

I recently got an AGM battery - if it really requires 14.4 to fully charge, will I have issues?
Is there a source where I can look up these charging voltages, or is this just common knowledge?

What Alternators are you referring to that Toyota started using in 2009/2010?
 
You're overthinking this. If your battery is a 12 volt battery, and you don't have excessive resistance between the alternator's output and battery post, leave it alone. I realize the battery provider will dictate an optimum voltage, but that voltage is a function of the alternator, regulator, resistance, temperature, load, RPM....etc. It is dynamic and based upon the numbers you've provided, you're fine. The voltage range Toyota provides is wide for a reason. The marginal benefit you'll achieve by goosing your voltage higher probably isn't worth the trouble of manipulating the voltage. If it really bothers you, plug the batteries into the charger at home once a week. Just my opinion.



At least personally I don't intend to use a voltage booster but the thread led me to two concerns that seem legit (i.e. not overthinking):

1) If one is on the low end of the alternator voltage range or lower (some posted numbers lower than 13.2), doesn't that indicate an alternator problem - perhaps new brushes or a new alternator are in order?

2) If an alternator is within Toyota spec and it still can't get close to the output needed to charge an AGM battery, is it true that the AGM can't ever get fully charged? In other words, if having an AGM battery requires higher voltage than we're capable of getting from our alternators, and therefore requires manual charging once a week, personally it is not worth the hassle and I would never have gotten the AGM battery. I'd sooner swap the new DieHard Platinum for a crappy old-school battery IF it's impossible for my alternator to fully charge my AGM. I don't think manually charging every week is acceptable or worth it (to me).

But this thread was the first where I saw specifics on what voltage an AGM battery needs to fully charge. If those numbers were overstated or inaccurate, or there's more to the story, I'd love to know. But hearing that a manual charge every week is needed is not comforting and does not seem like a real solution. If the real problem is low/out-of-spec alternator output, the fix should be new brushes, a new alternator or perhaps, for some, a voltage booster. If the real problem is an AGM battery demanding more than our systems are capable of, and the only solution is either a voltage booster or charging manually weekly, then at least for me I'll strongly consider reverting to an old-school battery.
 
The answer to item one is: not necessarily. If the current is high relative to the RPM, your voltage will inherently be low. The voltage reading alone doesn't provide enough information to determine if there's a problem. 13.2 V at 100 amps at 1,300 rpm would be excellent performance. 14.5 V at 20 amps at 6,000 rpm would also be excellent performance, provided the battery load is 20 amps.

The answer to item two is: what do you consider 100% charge, and under what conditions? Alternators don't equalize batteries; they maintain batteries. Under perfect laboratory conditions, I'm sure Odyssey has an optimum charging profile, as well as an acceptable range which conforms to the generally accepted automotive voltage ranges that typical voltage regulators induce.

I'm running an Odyssey and an Odyssey-built Sears Platinum. I've been running them now for 2 or 3 years. They work fine even though my output voltage varies from below cut-in voltage at idle to 14.3 at elevated rpm. Again, I'm not trying to be difficult, but I think you are looking for a problem that doesn't exist, based upon what Odyssey would like to see for "optimum" battery performance in a perfect world. I don't drive my Land Cruiser every day, and as such I'll hook up the garage charger once every few weeks just for good measure. Is it necessary? Probably not. Will it allow my batteries to last longer? Maybe. It certainly doesn't hurt them. If AGM batteries perform better when elevated voltage is applied so they become fully absorbed, then that's fine. But they're also designed to perform on automotive platforms, where the "acceptable" voltage range doesn't necessarily provide perfectly controlled, lab-ideal voltage and current.

I previously had $65 Interstate lead acid batteries that lasted close to 7 years. We'll see how the Odysseys compare over time.

 
agaisin - the exact alternator Toyota uses didn't necessarily change. Toyota used the ECU to make the changes to the regulator, thereby reducing pressure on the motor and improving fuel economy. Toyota (and all the 4WD manufacturers) simply don't care if you plan to have dual batteries. They build the car and electrical system to work for the car only. If you never go camping, never run fridges and never plan to put in a dual battery system, then you will never encounter this problem. You will get your 3-4 years out of your starting battery/batteries and all will be good.

In answer to your question about my source: this information comes from my work experience and from a Battery'oligist with 35 years of experience working for Exide Batteries in Australia; he currently works as my manager in an off-road specialist company.

Recently he uncovered the fact that a battery manufacturer has changed the acid content of their wet cell batteries, which causes the specifications of a charged battery to differ from what is accepted worldwide for automotive batteries.

Up until now, an automotive flooded lead acid battery cell was fully charged at 2.11 volts; with 6 cells per battery, 12.66 volts was fully charged. With the acid ratio changed, the voltage at full charge will now be 12.9 volts. Naturally this will affect fully discharged voltage too.

Why change the acid ratio? Well, the manufacturer can "form" the battery far quicker using a higher acid content. Put simply, they save money.

Forming is the process of charging the battery for the first time. Forming changes the lead oxide paste on the positive grids into lead dioxide and the lead oxide paste on the negative grids to metallic sponge lead.

The other problem our Battery'oligist has uncovered is that in a lot of cases battery manufacturers are stating that their batteries' charging range is quite large (13.2 to 14.9). This is possibly OK in the case of a starting battery, where the capacity is decreased only slightly by starting the car, and a normal working alternator usually puts out enough voltage to top up the battery. Then the day comes that you accidentally leave the lights on, or the radio going, or you use your winch for a reasonably long period (winches can draw up to 400 amps; most alternators are around 100 amps, so the battery is needed to provide the extra power), and all of a sudden the low-voltage alternator will have difficulty pushing past the resistance of a discharged battery.

This becomes even more important when looking at a battery in a cycling application (i.e. running the fridge), as we tend to deeply discharge our auxiliary batteries in order to keep the beer cool (an extremely important job). If we can ensure that the vehicle's charging system is performing at its peak, then you will be able to recharge that battery to 100% just by going for a drive, giving you the longest run time for your fridge and accessories after the vehicle is turned off.

If you look at a 3-stage battery charger, you will see that bulk charging brings the voltage of the battery up to the level required to then start absorption charging. Most chargers will charge at around 14.4 to 14.9 volts (depending on the chemistry of the battery). There is a good reason for this: it's what allows a mains-power charger to achieve 100% charge of the battery. It is this voltage that we IDEALLY would like from our vehicle's charging system for charging the AUXILIARY battery (and also to ensure that our starting battery is at peak performance).
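For anyone who hasn't watched one work, the three stages behave roughly like this (a simplified sketch with generic 12 V AGM numbers; real chargers add timers, temperature compensation and chemistry-specific profiles):

# Simplified 3-stage charge logic (bulk -> absorption -> float).
ABSORB_V = 14.6   # held until the charge current tapers off
FLOAT_V = 13.5    # maintenance voltage once the battery is full
TAPER_A = 1.5     # current threshold marking the end of absorption

def charge_stage(battery_volts, charge_amps):
    if battery_volts < ABSORB_V:
        return "bulk"        # full current, voltage rising toward absorb level
    if charge_amps > TAPER_A:
        return "absorption"  # hold absorb voltage while current tapers
    return "float"           # battery full, drop back to maintenance voltage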

Because many new vehicles have a low charging voltage, we (the mob I work for) sell a lot of flooded lead acid deep cycle batteries. They are an "old technology" battery, but we know they will charge to 100% at 13.8 volts. Will they charge at a lower voltage? Yes, but not to 100%. Could we put in an AGM as the auxiliary battery in a vehicle charging at 13.8? Yes, but again it will never achieve 100% charge; it might achieve 40-50% if you're lucky. Simply driving for longer will not charge the battery. Of course you could use a mains-power charger before leaving for your camping trip and hope that it lasts the weekend, but if you're staying a little longer you may run into problems. And if you don't use a mains-power charger regularly, the AGM battery will sulphate and die prematurely.

Voltage is important for charging because voltage is the pressure component of electricity. It's kinda like filling up a water tank from the bottom. You might have a large enough pipe (the cable) and plenty of water to push through it (the current), but if you don't have enough pressure in the pipe, then as the tank fills up it will get to a point where the water no longer flows due to the pressure of the water already in the tank. You need to increase the pressure to fill it all the way up.
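To put rough numbers on that analogy: charge current is roughly the voltage difference divided by the circuit resistance, so it dies away as the battery comes up (the resistance figure below is illustrative, not measured, and a real alternator would also current-limit):

# Charge current ~= (source volts - battery volts) / circuit resistance.
R_CIRCUIT = 0.025  # ohms; illustrative total for cables, connections, battery

def charge_amps(source_v, battery_v):
    return max(source_v - battery_v, 0.0) / R_CIRCUIT

for batt_v in (12.2, 12.8, 13.4, 13.7):
    print(f"battery at {batt_v} V: {charge_amps(13.8, batt_v):5.1f} A from 13.8 V,"
          f" {charge_amps(14.4, batt_v):5.1f} A from 14.4 V")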

Alternatively you could use a DC-DC charger to charge the auxiliary battery at the proper voltage from a wide-ranging input voltage. However, non-fan-cooled DC-DC chargers max out at only 40 amps. This option has no effect on the starting battery, and it also means that if you are trying to charge batteries in a caravan/camper trailer, at 40 amps it will struggle to charge all the batteries unless you drive for a long enough period. Ideally you would put another DC-DC charger in the trailer.

In my opinion the best option is to get the alternator to do the job as it has more amps to play with.

If I may give you a real-world case to back up my post: my mate Andrew, driving an 02 Hilux, installed a reputable dual battery management system with an Optima D31A (13" Yellowtop, 75 Ah cycling capacity) located in the tray of the vehicle. His issue was that his Engel Eclipse fridge would last only 8-10 hours once the vehicle stopped.

Grabbed my multimeter and measured voltage at the starting battery (14.04 volts), then checked at the Optima (13.77 volts). Why the voltage drop? The Optima was grounded to the chassis, but the starting battery was not. We bolted in an earth between motor and chassis and then had 14.02 volts at the Optima, a drop of 0.02 (very acceptable). I explained the voltage issue to him, and the use of a diode to bring the voltage up, but he decided to run with the improved voltage.

He was now achieving around 15-18 hours run time on the Engel Eclipse. Expecting more from that battery, he returned, and I convinced him to use the diode. He is now achieving 14.61 volts on the starting battery and 14.59 on the Optima. On a test run before his Fraser Island trip, his Engel lasted 58 hours without starting the car. The battery voltage was down to 11.9 volts, so it still had some capacity left.

The only change made was the voltage. It might seem like overthinking, or unimportant, but it makes a real difference to how long your fridge runs between runs of the car.

For the 2 weeks on Fraser Island he never had a problem and did not need to pull out the generator and battery charger I loaned him.

I guess to summarise: in a starting application you have more flexibility with the charging voltages, but if you're not careful you could discharge the battery and reduce its life.

In a deep cycle application, or if you use a winch regularly, it is recommended that you achieve the optimum voltages to ensure the best run time for your winch, fridge and accessories.

This will also generally improve the longevity of your batteries.

At the end of the day, we are trying to achieve the best results for your hard earned money. Enjoy your weekend:)




P.S. just previewed my post and realised it's like I'm writing a book. I'm new to the forum front so bear with me while I learn the ropes.:cheers:
 
CruiserNet: This is a fantastic write-up. Just the knowledge we need for vehicle battery charging. Thank you.

I have extensive experience and training in DC power supplies and battery banks, both for on-shore and off-shore use, in fishing vessels and mountain-top radio stations, and agree fully with your reasoning and conclusions. This really topped up my knowledge of modern batteries, as I might be a bit out of date.

PS: I'd buy your book ;)
 
Now, for adding a diode in the 100: I found that there is a short-pin in series with the ALT-S fuse, so it is possible to just pull out the short-pin and push in the diode, as long as it comes in a package that fits the socket.
This short pin is in gasoline and diesel vehicles from 7/2002 production (not the earlier years) and sits in the middle of the fuse box in the engine compartment. It's called Short Pin B, and is a long black latch across the two rows of fuses. The problem is that this long black bar consists of two short-pins in one package with 4 connectors: one (the interesting one) between the two end pins (pins 1-2) and the other between the two inner pins (pins 3-4). The second short-pin/latch is in series with fuse NV-IR, which is the middle fuse in the rear row, next to this short-pin; in some vehicles it is numbered 6 and in others fuse #10.
In my Owner's Manual, the fuse NV-IR is marked "No Circuit", so I suppose I can just forget about the second part of that short-pin bar. I haven't been able to find out what NV-IR is. Does anyone know?

Edit: See changes/additions below.
 
I would agree with most of what Cruisernet has stated, except that the change from using a conventional internal or external voltage regulator to controlling the alternator via an ECU output doesn't reduce "pressure" on the motor. The alternator is still under load, which still sucks power from the motor, regardless of the regulator location. My understanding is the push to move the regulator to the ECU is like everything else in the automotive world....driven by cost. It's one less component in the alternator subjected to inherent heat. Also, using an ECU-based PWM regulator allows the computer to control the alternator's response more precisely, as it derives feedback like engine load, battery temperature, air temperature, etc.... to constantly optimize the target charging voltage for better fuel economy, a better load response curve at startup, and better emissions by retarding the alternator load until the engine is more efficient.

With respect to Cruisernet's fridge example, appropriate grounding makes a world of difference, and the stated improved performance doesn't take into consideration the fridge's environmental temperature, the amount of goods in the fridge, compressor on-time versus off-time....etc, so I'm not quite sure one can expect to achieve 10x (or whatever) better performance by simply bumping the voltage by 3/10 of a volt, but to each their own.

I absolutely agree that elevated voltage will achieve more efficient charging of batteries that have a specific chemistry that lend themselves to higher voltage and I absolutely agree that by placing a diode on the regulator's sense circuit, it will fool the alternator into creating higher voltage....this is not a new concept. I suppose you could even use a potentiometer to sweep the full voltage range you want to hit. To that end, you could also use an external adjustable voltage regulator. But you're adding more parts and complexity to a system that works fairly well for the intended purpose, so I guess it just becomes a personal preference issue as to whether one monkeys with the voltage or not.
 
On a completely different tangent (read: hijack), and since Mobi-Arc is active on the thread...

If I were to buy a DC-Power alternator putting out 180-200 amps at idle, is this sufficient to run an on-board welder well?

What're the requirements regarding alternators? Is DC-Power a "known good" brand?
 
Chapter 3

mobi-arc said:
I would agree with most of what Cruisernet has stated, except that the change from using a conventional internal or external voltage regulator to controlling the alternator via an ECU output doesn't reduce "pressure" on the motor. The alternator is still under load, which still sucks power from the motor, regardless of the regulator location. My understanding is the push to move the regulator to the ECU is like everything else in the automotive world....driven by cost. It's one less component in the alternator subjected to inherent heat. Also, using an ECU-based PWM regulator allows the computer to control the alternator's response more precisely, as it derives feedback like engine load, battery temperature, air temperature, etc.... to constantly optimize the target charging voltage for better fuel economy, a better load response curve at startup, and better emissions by retarding the alternator load until the engine is more efficient.

You are most likely correct. We do not specifically deal with how an alternator works. What we do know is that by controlling the alternator's output via the ECU, Toyota can reduce the voltage once the "starting" batteries are charged sufficiently (not 100%) to start the car next time, which in turn achieves better fuel economy and therefore lower emissions; and yes, it is probably a cheaper way of doing it. Exactly how they do this is beyond my knowledge, but this is the information that was given to us by the engineers from the inner sanctum of Toyota Australia.


mobi-arc said:
With respect to Cruisernet's fridge example, appropriate grounding makes a world of difference, and the stated improved performance doesn't take into consideration the fridge's environmental temperature, the amount of goods in the fridge, compressor on-time versus off-time....etc, so I'm not quite sure one can expect to achieve 10x (or whatever) better performance by simply bumping the voltage by 3/10 of a volt, but to each their own.

I completely agree that achieving the longest run time for a fridge is dependent on the ambient temperature, how well it is stocked, and how often one opens it to retrieve the contents.

However that said, the state of charge of a battery would be just as important. If I may demonstrate:

Let's take my mate Andrew's situation and assume for a moment that his fridge is in 60 degree ambient, the fridge is jam-packed, and he is an excessive drinker, opening the door every 5 minutes.

We would now see a compressor on time of 100%.


For our example we will use a 100% charged Optima D31A rated 75 Ah at the 20 hour rate (3.75 amps for 20 hours), running any single-cabinet Engel fridge (not a fridge/freezer) with a current draw of 2.2 amps.

With a compressor on-time of 100%, we are taking 2.2 amps out of the battery continuously, which gives an estimated run time of about 34 hours before the Optima reaches 10.5 volts.

This is a considerable difference from Andrew's actual 15-18 hours. Therefore the only explanation is that the battery in Andrew's real-world scenario was not being fully charged at 14.02.

(Given the losses through the system and cabling, the exact figures might vary slightly, but not enough to warrant the calculation time. Also, because we are drawing less than the 20 hour rate, we would actually achieve a higher effective amp-hour capacity, since the discharge curve is not linear.)
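For anyone wanting to redo the arithmetic, here is the rough working (the Peukert exponent of ~1.1 is a typical AGM ballpark, not an Optima-published figure):

# Run-time estimate for the example above: 75 Ah (20 hour rate) battery,
# 2.2 A continuous fridge draw.
CAPACITY_AH = 75.0
RATE_HOURS = 20.0
PEUKERT_K = 1.1    # ballpark exponent for AGM, not an Optima figure

def runtime_hours(load_amps):
    i20 = CAPACITY_AH / RATE_HOURS                      # 3.75 A, the rated 20 h current
    return RATE_HOURS * (i20 / load_amps) ** PEUKERT_K  # Peukert's law

print(f"naive:   {CAPACITY_AH / 2.2:.0f} h")   # ~34 h, as above
print(f"Peukert: {runtime_hours(2.2):.0f} h")  # slightly more, since 2.2 A is below the 20 h rate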



mobi-arc said:
I absolutely agree that elevated voltage will achieve more efficient charging of batteries that have a specific chemistry that lend themselves to higher voltage and I absolutely agree that by placing a diode on the regulator's sense circuit, it will fool the alternator into creating higher voltage....this is not a new concept. I suppose you could even use a potentiometer to sweep the full voltage range you want to hit. To that end, you could also use an external adjustable voltage regulator. But you're adding more parts and complexity to a system that works fairly well for the intended purpose, so I guess it just becomes a personal preference issue as to whether one monkeys with the voltage or not.

Hear, hear, mobi-arc. We are big believers in the KISS principle, and yes, the diode idea is not new; we have been installing them for the past 2 years. Our first was a 200 series pulling a 20-foot caravan with 2 AGM batteries, and another AGM under the bonnet. His system was installed by another company (and it was physically installed perfectly, with very little voltage loss) and they sent him to us. The vehicle's voltage was simply too low to charge the aux batteries without the use of a mains-power charger.

As we had not heard about the diode before, it took us a couple of days to come up with the idea and the method to implement it. As this was a 2 month old $120,000 car, we discussed the idea with the customer, including the fact that the Toyota engineers couldn't give us an idea of the side effects of doing such a thing. After agreeing to the solution, we asked him to return or call the second he had a problem. We heard from him 1 month later to say everything was perfect. He and his wife are still travelling Australia (half their luck) and have yet to have a flat battery.

We are answering these questions numerous times a day and what we keep hearing is that "I did my research on the INTERNET and there was nothing about this!". :bang:

My aim is to spread the word that charge voltage, with respect to the type of battery you are using in a cycling application, is very important to running your accessories.

Catchya later :)

P.S. Thanks uHu. I'll let you know when it goes on sale.
 
I know Stan at DC Power; I've known him for over ten years, back when he was with Wrangler NW Power Products. My assumption is that if Stan says it's good, it's good. They're making really nice billet CNC'd front and rear housings, which is not a cheap process. Regarding 180 to 200 amps at idle, I would respectfully suggest that sounds optimistic. Regarding DC Power alternators being used with MOBI as the source of current: it would be a really nice welder, and Stan can pre-configure the alternator for direct connection to MOBI. The requirements are no avalanche diodes, the ability to be regulated externally with either an A or B circuit, and a solid, robust design with healthy current output. MOBI will take care of the rest.

Feel free to ring us to discuss.


 
