This study reports the development of a neural network (NN) model for instantaneous, accurate estimation of solar radiation spectra and budgets, geared toward satellite cloud data, using a ≈2.4M-record, high-spectral-resolution look-up table (LUT) generated with the radiative transfer model libRadtran. Two NN solvers, one for clear-sky conditions dominated by aerosol and one for cloudy-sky conditions, were trained on a normally distributed, multiparametric, uniformly gridded subset of the LUT spanning a very broad range of atmospheric and meteorological conditions as inputs, with the corresponding high-resolution spectra of global horizontal irradiance (GHI), direct normal irradiance (DNI), diffuse flux (DF) and actinic flux (AF) as target outputs. The NN solvers were tested by feeding them a large (10K-record) ‘off-grid’ random subset of the LUT spanning the training data space and comparing the simulated outputs with the target values provided by the LUT. The solvers demonstrated a capability to interpolate accurately over the entire multiparametric space. Once trained, the NN solvers allow high-speed estimation of solar radiation spectra at high spectral resolution (1 nm) and quantification of the effect of aerosol and cloud optical parameters on the solar radiation budget, without the need for a massive database. The cloudy-sky NN solver was applied to high-spatial-resolution (54K-pixel) cloud data extracted from the geostationary MSG3-SEVIRI imager, demonstrating that coherent maps of spectrally integrated GHI at this resolution can be produced on the order of 1 minute.
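The grid-training / off-grid-evaluation protocol described above can be illustrated with a minimal sketch. All names here are hypothetical and the toy two-parameter "LUT" function is invented for illustration (it is not the paper's libRadtran data); a linear interpolator stands in for the trained NN solver, since the point being illustrated is the evaluation on random off-grid samples drawn from the span of the training grid.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical toy "LUT": a smooth irradiance-like quantity as a function
# of two atmospheric parameters (aerosol optical depth, solar zenith angle).
def toy_irradiance(aod, sza_deg):
    return np.cos(np.radians(sza_deg)) * np.exp(-aod)

# Uniformly gridded training subset (stands in for the libRadtran LUT grid).
aod_grid = np.linspace(0.0, 1.0, 41)
sza_grid = np.linspace(0.0, 80.0, 41)
A, S = np.meshgrid(aod_grid, sza_grid, indexing="ij")
lut = toy_irradiance(A, S)

# The "solver": a linear interpolator playing the role of the trained NN.
solver = RegularGridInterpolator((aod_grid, sza_grid), lut)

# Random 'off-grid' test set spanning the training data space.
rng = np.random.default_rng(0)
test = np.column_stack([rng.uniform(0.0, 1.0, 10_000),
                        rng.uniform(0.0, 80.0, 10_000)])
pred = solver(test)
truth = toy_irradiance(test[:, 0], test[:, 1])

# Compare simulated outputs against the target values, as in the paper's test.
rmse = float(np.sqrt(np.mean((pred - truth) ** 2)))
print(f"off-grid RMSE: {rmse:.5f}")
```

The same evaluation logic carries over when the stand-in interpolator is replaced by a trained multilayer perceptron mapping atmospheric inputs to 1 nm spectra: accuracy on random off-grid points, rather than on the training grid itself, is what demonstrates that the solver has learned the multiparametric space rather than memorized the grid.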