Date of Award

12-2015

Document Type

Dissertation

Degree Name

Doctor of Philosophy (PhD)

Legacy Department

Mathematical Science

Committee Chair/Advisor

Sun, Xiaoqian

Committee Member

Brown, Derek

Committee Member

Gallagher, Colin

Committee Member

Li, Yingbo

Abstract

The dissertation consists of two distinct but related research projects. In the first, we study Bayesian analysis of two-piece location-scale models, which include several well-known distributions as special cases, such as the asymmetric Laplace, skewed normal, and skewed Student-t distributions. Two-piece location-scale models are an attractive way to model asymmetric data. From a practical point of view, a prior carrying some objective information is often preferable because little prior information is available in many applied settings. It has been shown that several commonly used objective priors, such as the Jeffreys prior, yield improper posterior distributions for two-piece location-scale models, which motivates us to consider alternative priors. Specifically, we develop reference priors with partial information that lead to proper posterior distributions, and we then extend them to a general class of priors. A necessary and sufficient condition is provided to ensure the propriety of the posterior distribution under this general class. Our results show that the proposed Bayesian approach outperforms the frequentist method in terms of mean squared error. Notably, the proposed Bayesian method also applies to quantile regression because of the close relationship between the asymmetric Laplace distribution and quantile regression.

The second project deals with Bayesian variable selection for maximum entropy quantile regression. Quantile regression has gained popularity in many areas because it provides richer information than ordinary mean regression, and variable selection plays an important role in building quantile regression models, since choosing an appropriate subset of predictors can improve prediction accuracy. Most existing quantile regression methods fix the quantile at a chosen value. However, if the goal is to identify which of the fitted quantile regression models describes the data best, the traditional approach may not be appropriate. We therefore treat the quantile as an unknown parameter and estimate it jointly with the regression coefficients. In particular, we consider maximum entropy quantile regression, whose error distribution is obtained by maximizing Shannon's entropy measure subject to two moment constraints. We apply the Bayesian adaptive Lasso to this model and place a flat prior on the quantile parameter, reflecting the lack of information about it. The proposed method not only identifies which quantile is the most probable among all candidates, but also reflects the underlying structure of the data through the estimated quantile. We develop an efficient Gibbs sampler and show, through both simulation studies and real data analysis, that the proposed method outperforms the Bayesian Lasso and the Bayesian adaptive Lasso with fixed quantile values.
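
The relationship between the asymmetric Laplace distribution and quantile regression invoked in the first project is a standard one; the LaTeX display below records it for reference, using generic symbols (y, \mu, \sigma, \tau) rather than the dissertation's own notation.

f(y \mid \mu, \sigma, \tau)
    = \frac{\tau(1-\tau)}{\sigma}
      \exp\!\left\{ -\rho_\tau\!\left( \frac{y - \mu}{\sigma} \right) \right\},
\qquad
\rho_\tau(u) = u \left\{ \tau - I(u < 0) \right\},
\qquad 0 < \tau < 1.

Because the log-likelihood equals, up to an additive constant, -\sum_i \rho_\tau\{(y_i - x_i^\top \beta)/\sigma\}, maximizing the asymmetric Laplace likelihood over \beta for fixed \sigma is equivalent to minimizing the check loss \sum_i \rho_\tau(y_i - x_i^\top \beta), i.e., fitting the \tau-th regression quantile.

As a small numerical illustration of that equivalence (a sketch on simulated data, not code from the dissertation), the Python snippet below fits a single fixed-quantile regression both by minimizing the check loss and by maximizing the corresponding asymmetric Laplace likelihood; the data-generating model, the choice tau = 0.75, and the helper names check_loss and neg_ald_loglik are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Illustrative simulated data (an assumption, not from the dissertation):
# one predictor with right-skewed errors, so different quantiles give different fits.
n = 500
x = rng.uniform(0.0, 2.0, size=n)
X = np.column_stack([np.ones(n), x])             # design matrix with an intercept
y = 1.0 + 2.0 * x + rng.exponential(1.0, size=n)

tau = 0.75                                       # fixed quantile, chosen only for illustration

def check_loss(beta):
    # Quantile regression objective: sum_i rho_tau(y_i - x_i' beta).
    u = y - X @ beta
    return np.sum(u * (tau - (u < 0.0)))

def neg_ald_loglik(beta):
    # Negative asymmetric Laplace log-likelihood with unit scale; it differs from
    # the check loss only by an additive constant, so it has the same minimizer.
    u = y - X @ beta
    return -np.sum(np.log(tau * (1.0 - tau)) - u * (tau - (u < 0.0)))

b_check = minimize(check_loss, np.zeros(2), method="Nelder-Mead").x
b_ald = minimize(neg_ald_loglik, np.zeros(2), method="Nelder-Mead").x
print("check-loss fit:    ", b_check)
print("ALD-likelihood fit:", b_ald)              # matches the check-loss fit

Here the quantile is held fixed, which is the baseline setting that the second project contrasts with its joint estimation of the quantile and the regression coefficients.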
