This talk presents a new type of regularized linear regression model, called TWT-LR-ETP, in which each predictor is conditionally truncated at unknown thresholds. The two-way truncated linear regression model (TWT-LR) can be viewed not only as a nonlinear generalization of the linear model but also as a much more flexible model with greatly enhanced interpretability and applicability. The TWT-LR model
performs classification through thresholds, similar to tree-based methods, and conducts the same inference as the classical linear model on the different segments. In addition, an innovative penalty,
called the extremely thresholding penalty (ETP), is applied to the thresholds. The ETP is independent of the values of the regression coefficients and requires no normalization of the regressors. The TWT-LR-ETP model detects thresholds over a wide range, including the two extreme ends where data are sparse. Under suitable
conditions, the estimators of both the coefficients and the thresholds are consistent, with the threshold estimators converging at rate n. Furthermore, the coefficient estimators are asymptotically normal when the dimension is fixed. Simulations and real data analyses demonstrate that the TWT-LR-ETP model captures various threshold features and provides better estimation and prediction than existing models.
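To make the truncation idea concrete, here is a minimal sketch, assuming "two-way truncation" means clipping a predictor at a lower and an upper threshold and that the thresholds can be profiled out by least squares over a grid. This is an illustrative assumption only; the paper's exact model formulation and the ETP penalty on the thresholds are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed form of two-way truncation: clip each predictor at a lower
# and an upper threshold (a hypothetical reading of the model).
def two_way_truncate(x, lower, upper):
    return np.clip(x, lower, upper)

# Simulate data from a single-predictor two-way truncated model.
n = 500
x = rng.normal(size=n)
a, b = -1.0, 1.0          # true thresholds (unknown in practice)
beta0, beta1 = 0.5, 2.0   # regression coefficients
y = beta0 + beta1 * two_way_truncate(x, a, b) + rng.normal(scale=0.1, size=n)

# Profile least squares: for each candidate threshold pair, fit OLS on
# the truncated predictor and keep the pair with the smallest residual
# sum of squares. The ETP term on the thresholds is omitted here.
grid = np.linspace(-2, 2, 21)
best = (np.inf, None, None)
for lo in grid:
    for hi in grid:
        if lo >= hi:
            continue
        X = np.column_stack([np.ones(n), two_way_truncate(x, lo, hi)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ coef) ** 2)
        if rss < best[0]:
            best = (rss, (lo, hi), coef)

rss, (lo_hat, hi_hat), coef_hat = best
print("thresholds:", lo_hat, hi_hat, "coefficients:", coef_hat)
```

On segments between the estimated thresholds the fit is an ordinary linear model, which is where the classical inference described above would apply.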