According to the basic theory that Watts = Volts * Amps, the nominal value would be 360 W = 12 V * 30 A (assuming you are running a 12 V system).
However, things are never exactly at the nominal values; you will typically charge at a higher voltage (say 13.8 V), so you could say the controller can handle 414 W (13.8 V * 30 A).
Then consider that solar panels never deliver 100% of their rated output, so I would not be uncomfortable connecting, say, 450 W of panels to a 30 A charger.
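To put numbers on that, here is a quick Python sketch of the sizing arithmetic (the 80% derate is just an assumed figure for illustration; the other numbers come straight from the example above):

# Rough sizing check for a 30 A controller on a 12 V system (figures from the post above)
nominal_voltage = 12.0      # V, nominal system voltage
charge_voltage = 13.8       # V, typical charging voltage
controller_amps = 30.0      # A, controller rating

nominal_watts = nominal_voltage * controller_amps    # 360 W
charge_watts = charge_voltage * controller_amps      # 414 W

panel_watts = 450.0         # W, proposed array size from the example
derate = 0.8                # assumed: panels rarely deliver full rated output
expected_watts = panel_watts * derate                # ~360 W realistic peak

print(f"Nominal limit: {nominal_watts:.0f} W; at charge voltage: {charge_watts:.0f} W")
print(f"Expected peak from a {panel_watts:.0f} W array: ~{expected_watts:.0f} W")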
We have 480 W of solar, and I've often seen 25 A delivered in hot weather (when the 12 V fridge is working hard), with 28 A max delivered via our 30 A regulator.
Experts tell me that it is most efficient to match the panel output (amps) to the regulator's amp rating as closely as possible.
__________________
"Why is it so?" Professor Julius Sumner Miller, a profound influence on my life, explained science to us on TV in the '60s.
Go to a higher voltage from the panels and you have lower amperage.
As Plendo noted above, Watts = Volts * Amps
Conversely, Amps = Watts/Volts
In our case, we can harvest up to 1400 W (bright sunny day at noon on the solstice). Our system is two series strings of 3 x 235 W panels; each string is thus 700 W at 90 V. These two strings are then set in parallel into a Tri-Star MPPT-45 (45 A). 1400 W at 90 V is only 15.5 A. If we were to run 12 V panels in parallel at 1400 W, that would be 116 A.
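A quick sketch of that Amps = Watts/Volts comparison (the all-parallel 12 V case is hypothetical, just for contrast):

# Same 1400 W array at two different wiring voltages: amps = watts / volts
array_watts = 1400.0
series_voltage = 90.0     # two parallel strings of 3 x 235 W panels in series
parallel_voltage = 12.0   # hypothetical all-parallel 12 V wiring

print(f"At {series_voltage:.0f} V: {array_watts / series_voltage:.1f} A")      # ~15.6 A
print(f"At {parallel_voltage:.0f} V: {array_watts / parallel_voltage:.0f} A")  # ~117 A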
The MPPT-45 then converts the 90 V down to 54 V, so the MPPT supplies 1400 W at 26 A to the battery bank (48 V nominal). Of course, the 1400 W is a few percent lower after going through the MPPT.
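And the conversion step, with an assumed 97% MPPT efficiency standing in for that "few percent" loss:

# The MPPT down-converts the 90 V input to the ~54 V battery charge voltage
input_watts = 1400.0
battery_voltage = 54.0
mppt_efficiency = 0.97    # assumed value; the actual loss is only "a few percent"

output_amps = input_watts * mppt_efficiency / battery_voltage
print(f"~{output_amps:.0f} A into the 48 V nominal battery bank")  # ~25 A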
US caravans use 12 V nominal DC, so we have a MeanWell 508 W 48 V to 12 V power supply. The battery suite is also connected to a Magnum 4.0 kW PSWI (pure sine wave inverter) for AC requirements.