Honestly, I don't get how you can even consider a 20 A charger to be adequate, let alone recommend it to others. My T6 pulls 40+ A for up to an hour when I connect a charger after a lot of short-distance driving. Considering that all start-stop models only charge the battery to 80-85% (to leave room for recuperation), that's not really a surprise. Even after a few hours, the load (with infotainment etc. already switched off) still falls into the 12-18 A range. And a charger labelled 20 A rarely delivers that much in practice (has anyone put a current clamp on one to double-check what it actually puts out?).
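Just to put rough numbers on it, here is a quick back-of-the-envelope sketch (Python, purely illustrative - the current figures are the ones from my car above, everything else is assumption):

# Rough numbers from above: a battery that wants 40+ A after short-trip driving,
# roughly 12-18 A of consumer draw during a session, and a charger labelled 20 A.
charger_a = 20.0        # nominal label - real output is often lower
consumers_a = 15.0      # middle of the 12-18 A range with infotainment off
battery_wants_a = 40.0  # charge acceptance after lots of short trips (assumed constant here)

# Scenario 1: just recharging after short trips
print(f"Recharging alone: battery wants {battery_wants_a:.0f} A, charger supplies "
      f"{charger_a:.0f} A -> roughly {battery_wants_a / charger_a:.1f}x longer in the bulk phase")

# Scenario 2: supporting the car during a diagnostic session
left_for_battery = charger_a - consumers_a
print(f"During a session: {left_for_battery:+.0f} A left for the battery "
      f"(and that's before the optimistic label comes into play)")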
Back when I used to do AAPEX with RT, we had a 25 A (yellow-ish) charger hooked up to a Q5. Even with systems shut down, the charger couldn't keep up with the current draw, so the car kept shutting down more systems. We had to (de-)code the DRLs and other stuff just to get the consumption low enough... There is a reason why manufacturers recommend 50/70/110 A chargers these days.
I normally use a Fronius 35 A, which has all the nice features (true buffer mode for diagnostics, standard charging modes for AGM/wet/gel batteries, switchable between 6/12/24 V, desulphation, etc.) and has been great for diagnostic and demo purposes - my older Fronius 25 A is no longer enough for modern cars, since it was running at MAX current a lot of the time. How does a charger without any modes figure out which battery is installed? It can't tell by itself whether it needs to put out the proper voltage/current for AGM, gel or wet. Standard charging differs from buffering as well. Does it recognize when plus and ground are reversed, or will it just push out current either way and cause a short? We complain when customers have shitty chargers, and we blame communication issues and other shit on the charger - and then we turn around and use something like this?! *headscratching*
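For anyone wondering what "proper voltage/current per chemistry" actually means in practice, here is a minimal sketch with typical ballpark 12 V setpoints. These are generic textbook figures, not taken from any specific charger or battery datasheet, so treat them as illustration only:

# Ballpark 12 V setpoints per chemistry (illustrative only - always follow the
# battery manufacturer's data; real values vary by brand and temperature).
CHARGE_PROFILES = {
    #               absorption V, float V
    "wet/flooded": (14.4, 13.5),
    "agm":         (14.7, 13.6),
    "gel":         (14.1, 13.5),
}

def charge_voltage(chemistry: str, stage: str = "absorption") -> float:
    """Return the target voltage for a normal charge curve."""
    absorption_v, float_v = CHARGE_PROFILES[chemistry]
    return absorption_v if stage == "absorption" else float_v

# Buffer / power-supply mode for diagnostics is a different job again:
# hold one fixed voltage (e.g. ~13.8 V) and deliver whatever current the car
# pulls, instead of stepping through bulk/absorption/float stages.
print(charge_voltage("agm"), charge_voltage("gel", "float"))

A charger with no mode selection simply can't pick the right row from that table - which is exactly the problem.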
Long story short: considering how much we all rely on battery chargers in our working lives, is this really the place to save money?