In leave-one-out (LOO) validation, you train one model per data point: each iteration holds out a single point for validation and trains on the remaining points. So, for a dataset of 100,000 points, you would need to train the model 100,000 times.
In k-fold cross-validation, you divide the dataset into k folds and train the model k times, each time using a different fold as the validation set and the remaining folds for training. In this case, you mentioned 10-fold cross-validation, so you would need to train the model 10 times.
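To make the fold bookkeeping concrete, here is a minimal pure-Python sketch of how k-fold splits can be generated (the helper `kfold_indices` is illustrative, not from any library; scikit-learn's `KFold` does the same job in practice):

```python
def kfold_indices(n_samples, k):
    # Split indices into k roughly equal folds; each fold serves
    # exactly once as the validation set, so k models are trained.
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    indices = list(range(n_samples))
    splits = []
    start = 0
    for size in fold_sizes:
        val = indices[start:start + size]
        train = indices[:start] + indices[start + size:]
        splits.append((train, val))
        start += size
    return splits

splits = kfold_indices(10, 5)
print(len(splits))  # 5 -- one model per fold
```

Note that the number of splits depends only on k, not on the dataset size, which is the key difference from LOO.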
Now, to compare the number of models that need to be trained for LOO versus 10-fold cross-validation:
Number of models for LOO = 100,000
Number of models for 10-fold cross-validation = 10
The ratio of the number of models for LOO to 10-fold cross-validation is:
100,000 / 10 = 10,000
So, the number of models that need to be trained for LOO validation is 10,000 times greater than the number needed for 10-fold cross-validation.
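The arithmetic above can be checked with a tiny script (the function names are illustrative, not from any library):

```python
def count_loo_models(n_samples: int) -> int:
    # LOO trains one model per held-out sample.
    return n_samples

def count_kfold_models(k: int) -> int:
    # k-fold trains one model per fold, regardless of dataset size.
    return k

n = 100_000
loo, kfold = count_loo_models(n), count_kfold_models(10)
print(loo, kfold, loo // kfold)  # 100000 10 10000
```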