Scalable Subsampling Inference for Deep Neural Networks (Working)

We propose a scalable subsampling method for estimating a standard fully connected DNN. It turns out that, under mild conditions, the mean squared error bound for DNN-based estimation of a target regression function can be improved. Moreover, the subsampling estimator can be computed faster than training a single DNN on the whole dataset. In addition, we propose several methods to estimate the bias order of the DNN estimator and to build confidence and prediction intervals based on DNNs.
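To make the idea concrete, here is a minimal sketch of a generic subsample-and-aggregate estimator: the data are split into blocks, a small network is fit on each block, and the block-level predictions are averaged. This is only an illustration of the general scheme, not the paper's exact procedure; the block size, stride, network architecture, and the use of `sklearn`'s `MLPRegressor` are all illustrative assumptions.

```python
# Hypothetical sketch of subsample-and-aggregate DNN estimation.
# All parameters below (block size, stride, architecture) are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 2000
X = rng.uniform(-1, 1, size=(n, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)

b = 500          # subsample (block) size -- an assumed choice
stride = 500     # non-overlapping blocks, for simplicity
models = []
for start in range(0, n - b + 1, stride):
    # Fit a small fully connected network on each data block.
    net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    net.fit(X[start:start + b], y[start:start + b])
    models.append(net)

# Aggregate: average the block-level predictions at a new point.
x_new = np.array([[0.25]])
preds = np.array([m.predict(x_new)[0] for m in models])
point_estimate = preds.mean()
```

Because each block has size b < n and DNN training cost grows superlinearly in the sample size, fitting the blocks (even sequentially, and trivially in parallel) can be cheaper than one fit on all n observations, which is the computational point made above.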

June 2024 · Kejin Wu, Dimitris Politis