advantageous to build local regression models than a global model. Additionally, we propose local feature selection via regularization. Empirical research carried out on three data sets from the real estate market confirmed the effectiveness of this approach. We paid special attention to model quality assessment, using cross-validation to estimate the residual standard error.
Statistics in Transition New Series , ISSUE 3, 515–524
Methods of de-noising the output signal of the JSD-I/A quartz flexural accelerometer based on five types of multiwavelets are comparatively investigated in this paper. First, the theory of the multiwavelet transform and the generalized cross-validation criterion are analyzed. Second, because the JSD-I/A quartz flexural accelerometer, which is mounted on the SCT-1 two-axis rotation platform with a dedicated clamp, has a start-up procedure of 3 minutes, the output signal of the quartz flexural
International Journal on Smart Sensing and Intelligent Systems , ISSUE 1, 191–209
Statistics in Transition New Series , ISSUE 4, 237–253
to the errors, giving a robust estimation of regression coefficients (37, 41).
After fitting the model, we used exact leave-one-out cross-validation (LOO-CV), leaving out one patient at a time, to compare the model to three less complex models, repeating this across all four informant perspectives on conflict. LOO-CV estimates the expected log predictive density (ELPD), indicating how well the model is expected to fit new data from the same distribution (42). The first of these three models had
Erling W. Rognli,
Nikolai O. Czajkowski,
Scandinavian Journal of Child and Adolescent Psychiatry and Psychology , 110–122
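The exact leave-one-out procedure described in the abstract above can be sketched as follows. This is a minimal illustration that fits a simple Gaussian model on n−1 observations and scores the held-out point each time; the study itself fits more complex models across informant perspectives, so the model and data here are placeholders.

```python
import numpy as np

def loo_elpd_gaussian(y):
    """Exact leave-one-out estimate of the expected log predictive
    density (ELPD): refit a Gaussian on n-1 points, then evaluate the
    log predictive density of the single held-out observation."""
    n = len(y)
    log_pred = []
    for i in range(n):
        rest = np.delete(y, i)
        mu, sigma = rest.mean(), rest.std(ddof=1)
        # log density of the held-out point under the refit model
        log_pred.append(-0.5 * np.log(2 * np.pi * sigma**2)
                        - (y[i] - mu) ** 2 / (2 * sigma**2))
    return float(np.sum(log_pred))

rng = np.random.default_rng(0)
y = rng.normal(5.0, 1.0, size=40)   # synthetic outcome scores
elpd = loo_elpd_gaussian(y)
```

A higher ELPD indicates better expected out-of-sample fit, so competing models can be ranked by comparing their ELPD estimates on the same data.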
In this paper, Self-adaptive Differential Evolutionary Extreme Learning Machine (SaDE-ELM) was proposed as a new class of learning algorithm for the single-hidden-layer feedforward neural network (SLFN). In order to achieve good generalization performance, SaDE-ELM calculates the error on a subset of testing data for parameter optimization. Since SaDE-ELM employs extra data for validation to avoid the overfitting problem, more samples are needed for model training. In this paper, the cross
International Journal of Advanced Network, Monitoring and Controls , ISSUE 3, 72–77
complexity are regulated using K-fold cross-validation and early stopping technique. The soft sensor is validated using actual well test data from producing wells, and model performance is analyzed using cumulative deviation and cumulative flow plots. The developed soft sensor shows promising performance with a mean absolute percent error of around 4% and less than 10% deviation for 90% of the samples.
Tareq Aziz AL-Qutami,
Mohd Azmin Ishak
International Journal on Smart Sensing and Intelligent Systems , ISSUE 1, 199–222
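The validation scheme described in the soft-sensor abstract above, K-fold cross-validation scored by mean absolute percent error (MAPE), can be sketched as follows. A plain linear least-squares fit stands in for the paper's neural-network soft sensor, and the data are synthetic; only the K-fold/MAPE machinery is the point.

```python
import numpy as np

def kfold_mape(X, y, k=5):
    """K-fold cross-validation reporting mean absolute percent error.
    A linear least-squares model is a placeholder for the soft sensor."""
    idx = np.arange(len(y))
    rng = np.random.default_rng(42)
    rng.shuffle(idx)
    folds = np.array_split(idx, k)
    fold_mapes = []
    for f in folds:
        train = np.setdiff1d(idx, f)          # all indices not in this fold
        Xtr = np.c_[np.ones(len(train)), X[train]]
        w, *_ = np.linalg.lstsq(Xtr, y[train], rcond=None)
        pred = np.c_[np.ones(len(f)), X[f]] @ w
        fold_mapes.append(np.mean(np.abs((y[f] - pred) / y[f])) * 100)
    return float(np.mean(fold_mapes))

rng = np.random.default_rng(1)
X = rng.uniform(1, 10, size=(200, 3))          # synthetic well features
y = X @ np.array([2.0, 0.5, 1.0]) + 20 + rng.normal(0, 0.5, 200)
mape = kfold_mape(X, y)
```

Averaging MAPE over the K held-out folds gives an error estimate on unseen data, which is the figure the abstract reports (around 4%) for the developed soft sensor.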
for the training module. DataSetE is used as the data set to be tested. Three support vector machines are constructed here, namely SVM1, SVM2, and SVM3. After training the classifiers SVM1, SVM2, and SVM3, DataSetE is used as the test sample data set, and experimental results are obtained through the SVM classifiers.
Finding optimal parameters
An algorithm based on the cross-validation idea is used to select an optimal value of the parameter C for the RBF kernel function and optimal parameters C
International Journal of Advanced Network, Monitoring and Controls , ISSUE 3, 53–60
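The cross-validation-based parameter selection described above can be sketched with a grid search over C and the RBF kernel width. This is an assumption-laden illustration using scikit-learn's `GridSearchCV` and `SVC` on synthetic data in place of the paper's own implementation and data sets.

```python
# Hedged sketch: k-fold cross-validated grid search over C and gamma
# for an RBF-kernel SVM; scikit-learn stands in for the paper's code.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=6, random_state=0)

grid = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]},
    cv=5,  # 5-fold cross-validation on the training data
)
grid.fit(X, y)

best_C = grid.best_params_["C"]          # selected penalty parameter
best_gamma = grid.best_params_["gamma"]  # selected RBF kernel width
```

Each (C, gamma) pair is scored by its mean accuracy across the folds, and the pair with the best cross-validated score is kept for the final classifier.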
expected value, but also the variance to determine the uncertainty of the prediction, or we can use the entire distribution for, e.g., more accurate further calculations or generating random values. 10-fold cross-validation log-likelihood tests were conducted for 22 DAX companies, leading to very accurate predictions, especially when individual models were used for each company, as significant differences were found between their behaviours. An additional advantage of using this methodology is that it
Statistics in Transition New Series , ISSUE 5, 99–118
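A 10-fold cross-validated log-likelihood test of the kind mentioned in the abstract above can be sketched as follows. As an assumption, a single Gaussian fitted to simulated daily returns stands in for the paper's per-company distributional models; the mechanics of summing held-out log-likelihoods across folds are the same.

```python
import numpy as np

def cv_loglik(returns, k=10):
    """10-fold cross-validated log-likelihood: fit a Gaussian on the
    training folds, sum the log-likelihood of the held-out fold."""
    idx = np.arange(len(returns))
    folds = np.array_split(idx, k)
    total = 0.0
    for f in folds:
        train = np.setdiff1d(idx, f)
        mu = returns[train].mean()
        var = returns[train].var(ddof=1)
        held = returns[f]
        # Gaussian log density of each held-out return, summed
        total += np.sum(-0.5 * np.log(2 * np.pi * var)
                        - (held - mu) ** 2 / (2 * var))
    return float(total)

rng = np.random.default_rng(7)
r = rng.normal(0.0005, 0.02, size=500)   # synthetic daily returns
ll = cv_loglik(r)
```

Candidate distributions (e.g. one model per company versus a pooled model) can then be compared by their total held-out log-likelihood, with higher values indicating a better-fitting distribution.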