This is not a straightforward question, because the alternator's rated output in amps has little to do with the current the battery gets charged at; it mainly implies the alternator has a low internal resistance, meaning it can deliver a lot of amps without its voltage dropping too much. The charging current is determined by the difference between the alternator output voltage (say 14.6 volts) and the battery voltage (say 12.4 volts for a partially discharged battery). A difference of a couple of volts may not sound like much, but the internal resistance of the battery is very low, in the milliohm range, and the alternator's resistance is also low. Ohm's law applies: the charging current equals the voltage difference divided by the combined resistance of the circuit.
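As a rough sketch of that Ohm's law estimate, here's the arithmetic in Python. The resistance figures are illustrative assumptions (the post only says they are in the milliohm range), not measured values for any particular car:

```python
# Charging current from Ohm's law: I = (V_alt - V_batt) / R_total.
# Resistances below are assumed for illustration only.
alternator_v = 14.6    # alternator output voltage (V), from the example above
battery_v = 12.4       # partially discharged battery voltage (V)
r_battery = 0.020      # battery internal resistance (ohms) - assumed
r_alternator = 0.015   # alternator + wiring resistance (ohms) - assumed

charging_current = (alternator_v - battery_v) / (r_battery + r_alternator)
print(f"Charging current ~ {charging_current:.1f} A")  # ~62.9 A with these assumptions
```

With only 35 milliohms in the loop, a 2.2 volt difference pushes over 60 amps, which is why the current tapers off naturally as the battery voltage rises toward the alternator's.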
Perhaps I did not ask the right question. I see that some ammeters fitted to cars are 30-0-30 amps and some are 60-0-60 amps. If I were to fit an aftermarket ammeter, which should I use on my CLK 320?
When ammeters were fitted to cars, they were connected so that they measured all of the current flowing between the generator and the battery, except the current drawn by the starter motor. But that was a long time ago, and ammeters mostly disappeared when dynamos were replaced by alternators.
If it's even possible on a modern car, I'd fit the 60-0-60 version. Is this a modern shunt-type or clip-on ammeter, or are you going to need seriously heavy-gauge wiring, at least 6 mm²? Even cable of that size would drop about 0.175 volts per metre at 60 amps.
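The voltage-drop figure can be sanity-checked from the resistivity of copper. This sketch assumes annealed copper at roughly room temperature (resistivity about 1.75e-8 ohm·metres, an assumed value) and the 6 mm² cross-section mentioned above:

```python
# Voltage drop per metre of 6 mm^2 copper cable carrying 60 A.
# V = I * R, where R per metre = resistivity / cross-sectional area.
rho = 1.75e-8        # copper resistivity (ohm·m) - assumed, ~20 degrees C
area = 6e-6          # 6 mm^2 conductor cross-section, in m^2
current = 60.0       # full-scale ammeter current (A)

r_per_metre = rho / area              # ~2.9 milliohms per metre
drop_per_metre = current * r_per_metre
print(f"Drop ~ {drop_per_metre:.3f} V per metre")  # ~0.175 V/m
```

A few metres of run to a dash-mounted ammeter and back would therefore waste a volt or more at full charge current, which is the practical argument against the old series-wired design.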
Frankly, a voltmeter conveys much the same information about whether the battery is charging, and it's much simpler and safer to wire in than the old-fashioned series type of ammeter.