The Villain-Lai-Das Sarma (VLDS) equation has received much attention in surface growth dynamics because it effectively describes the molecular beam epitaxy (MBE) growth process. However, the scaling exponents of the VLDS equation driven by long-range correlated noise remain unclear, because different analytical approximation methods yield inconsistent results. The nonlinear term in the VLDS equation also poses a challenge to numerical simulation, often leading to numerical divergence. Existing numerical approaches mainly replace the nonlinear term with exponentially decaying variants to alleviate this divergence, but recent studies have shown that such replacements may change the scaling exponents and the universality class of the growth system. In this work, we therefore propose a novel method based on deep neural networks to address this problem. First, we construct a fully convolutional neural network to characterize the deterministic terms of the VLDS equation. To train the network, we generate training data with the traditional finite-difference method before numerical divergence occurs. We then train the network to represent the deterministic terms and, based on the trained network, simulate the VLDS equation driven by long-range temporally and spatially correlated noise. The simulation results demonstrate that the deep neural networks constructed here possess good numerical stability and yield reliable scaling exponents of the VLDS equation driven by both uncorrelated and correlated noise. Furthermore, we find that the VLDS system driven by long-range temporally correlated noise exhibits a mound-like morphology when the temporal correlation exponent is large enough, whereas the surface driven by spatially correlated noise retains a self-affine fractal structure, independent of the spatial correlation exponent.
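The training-data generation step described above can be sketched as follows. This is a minimal illustration, assuming the standard (1+1)-dimensional VLDS form ∂h/∂t = −ν∇⁴h + λ∇²(∇h)² + η with an explicit Euler scheme and periodic boundaries; the parameter values, step sizes, and function names are illustrative assumptions, not the authors' exact setup.

```python
import numpy as np

def laplacian(f, dx):
    # Second-order central difference with periodic boundaries.
    return (np.roll(f, -1) - 2.0 * f + np.roll(f, 1)) / dx**2

def vlds_deterministic(h, nu, lam, dx):
    # Deterministic part of the VLDS equation: -nu * grad^4 h + lam * grad^2 (grad h)^2.
    grad = (np.roll(h, -1) - np.roll(h, 1)) / (2.0 * dx)
    return -nu * laplacian(laplacian(h, dx), dx) + lam * laplacian(grad**2, dx)

def generate_training_pairs(L=64, steps=100, dt=0.01, dx=1.0,
                            nu=1.0, lam=1.0, seed=0):
    # Evolve a flat surface with an Euler step and collect
    # (height profile, deterministic term) pairs as training data,
    # stopping early if the solution starts to diverge.
    rng = np.random.default_rng(seed)
    h = np.zeros(L)
    pairs = []
    for _ in range(steps):
        det = vlds_deterministic(h, nu, lam, dx)
        if not np.all(np.isfinite(det)):
            break  # keep only pre-divergence data
        pairs.append((h.copy(), det.copy()))
        noise = rng.standard_normal(L) * np.sqrt(2.0 * dt / dx)  # uncorrelated noise
        h = h + dt * det + noise
    return pairs

pairs = generate_training_pairs()
```

A network trained on such `(h, det)` pairs can then replace `vlds_deterministic` in the Euler update, which is the substitution the work relies on for stable long-time simulation.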