The maximum charging power that can be used depends on two things:
1. the available power of the charger (and the mains supply that it comes from)
2. the maximum charge rate that the battery can accept (this varies with battery temperature, State of Charge (SoC) and State of Health (SoH)).
Whichever of (1) or (2) gives the lower kW value limits the charging power.
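To make the "lower of the two" rule concrete, here is a minimal Python sketch; the function name and the example figures (a 7kW charge point, a 21kWh pack accepting roughly 1C) are illustrative assumptions, not values from any particular vehicle.

```python
def charging_power_kw(charger_kw: float, battery_accept_kw: float) -> float:
    """Power that actually flows into the pack: the lower of the two limits."""
    return min(charger_kw, battery_accept_kw)

# A 7kW charge point feeding a 21kWh pack that accepts ~1C (21kW):
print(charging_power_kw(7.0, 21.0))   # 7.0  -> the charger is the bottleneck
print(charging_power_kw(50.0, 21.0))  # 21.0 -> the battery is the bottleneck
```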
As a typical answer, (1) usually limits the charge rate to 3kW or 7kW for an on-board charger, or to anything up to around 50kW for an off-board charger.
The second factor (2 above) typically limits the charge rate to around the 1C rate (=21kW in your example) up until the highest cell voltage reaches its allowable limit. This often happens at around 80% SoC. To charge beyond this level, the charge rate has to be continually reduced while a constant charge voltage is maintained.
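Here is a rough sketch of that battery-side limit, assuming constant ~1C acceptance up to an 80% knee and a simple linear taper in the constant-voltage region. Real cells taper according to their voltage and internal-resistance curves, so the shape below is purely illustrative.

```python
def battery_accept_kw(soc: float, capacity_kwh: float = 21.0,
                      c_rate: float = 1.0, knee_soc: float = 0.80) -> float:
    """Approximate power (kW) the pack will accept at a given SoC (0..1)."""
    full_power = c_rate * capacity_kwh            # ~1C, i.e. 21kW here
    if soc <= knee_soc:
        return full_power                         # constant-current region
    # constant-voltage region: power tapers towards zero as SoC approaches 100%
    return full_power * (1.0 - soc) / (1.0 - knee_soc)

for soc in (0.20, 0.50, 0.80, 0.90, 0.99):
    print(f"SoC {soc:.0%}: pack accepts ~{battery_accept_kw(soc):.1f} kW")
```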
Example: if you wanted to charge your 21kWh battery from 10% SoC to 100% SoC from a 7kW charge point, it would take (0.8-0.1)*21kWh/7kW = 2.1 hours to get from 10% to 80% (limited by the charger), and then some more time (limited by the battery cell characteristics) to get from 80% to 100%. To work out the time taken for this second phase, you need more details of the cells used in the battery (Open Circuit Voltage vs. SoC, DC internal resistance vs. SoC, maximum allowable cell voltage).
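A worked version of this example, using the same assumed numbers (21kWh pack, 7kW charge point, 10% to 100%) and the illustrative linear taper from the sketch above for the second phase; with real cell data the second-phase time would come out differently.

```python
CAPACITY_KWH = 21.0   # pack size from the example
CHARGER_KW = 7.0      # charge point rating
KNEE_SOC = 0.80       # assumed start of the constant-voltage taper

def accepted_kw(soc: float) -> float:
    """Assumed pack limit: ~1C up to the knee, linear taper to zero at 100%."""
    full = 1.0 * CAPACITY_KWH
    return full if soc <= KNEE_SOC else full * (1.0 - soc) / (1.0 - KNEE_SOC)

def charge_time_h(start_soc: float, end_soc: float, step: float = 0.001) -> float:
    """Numerically integrate dt = dE / P over small SoC slices."""
    hours, soc = 0.0, start_soc
    while soc < end_soc:
        power = min(CHARGER_KW, accepted_kw(soc))
        if power <= 0:
            break                                # pack accepts nothing more
        hours += step * CAPACITY_KWH / power     # dE = step * capacity
        soc += step
    return hours

print(f"10% -> 80%:  {charge_time_h(0.10, 0.80):.2f} h")   # 0.7*21/7 = 2.10 h
print(f"10% -> 100%: {charge_time_h(0.10, 0.999):.2f} h")  # longer, taper-limited
```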
The only reliable way to "know" is to meter the incoming power. Even that figure is partly misleading if the question is "how much energy is going into the battery?", because part of the energy consumed goes to supplying the surrounding components (e.g. cooling, control units). Manufacturer data may give hints, though my guess is that real consumption is somewhat higher.
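As an illustration of why the metered figure exceeds what actually lands in the pack, the sketch below assumes a notional 90% charger efficiency and a 0.3kW ancillary load; both numbers are placeholders, not manufacturer data.

```python
def wall_energy_kwh(battery_energy_kwh: float,
                    charger_efficiency: float = 0.90,   # assumed AC->DC efficiency
                    overhead_kw: float = 0.3,           # assumed cooling/control load
                    charge_time_h: float = 3.0) -> float:
    """Energy metered at the wall to push battery_energy_kwh into the pack."""
    conversion = battery_energy_kwh / charger_efficiency  # conversion losses
    ancillaries = overhead_kw * charge_time_h              # cooling, control units
    return conversion + ancillaries

# Putting ~18.9kWh into the pack (10% -> 100% of 21kWh) over roughly 3 hours:
print(f"~{wall_energy_kwh(18.9):.1f} kWh metered at the wall")
```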
In general, lower-power chargers will require less total energy to reach the same SoC. The downside is that the charging process takes significantly more time. Just like with conventional cars: "If you want to go faster, be prepared to need more 'fuel'."