Academic Seminars (刘卫东, 冯兴东; Sept. 28)


Posted by: 周妍 (Zhou Yan)  Date posted: 2018-09-14
Topics
1. Distributed stochastic gradient descent with diverging dimensions
2. Lack-of-fit tests for quantile regression models
Time
-
Venue
Room 415, New Mathematics Building (新数学楼)
Speakers
Prof. 刘卫东 (Weidong Liu), Prof. 冯兴东 (Xingdong Feng)

Abstracts:

1. In this talk, we will investigate the statistical estimation error of the stochastic gradient descent (SGD) method when the dimension goes to infinity. The results are then applied to distributed statistical estimation with divide-and-conquer SGD. In particular, we will show that to achieve the optimal estimation rate, a necessary condition on the number of machines $K$ is $K=O(\sqrt{n/p})$, where $n$ is the sample size and $p$ is the dimension. To avoid this strict condition, we will further introduce a stochastic approximate Newton-type method for distributed statistical estimation.
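
To fix ideas, here is a minimal sketch of the divide-and-conquer SGD scheme described above, written for a least-squares model. The function names, the $1/\sqrt{t}$ step-size schedule, and the simulated data are illustrative assumptions, not the speaker's implementation.

```python
import numpy as np

def sgd_least_squares(X, y, lr=0.01, n_epochs=1):
    """Plain SGD for least squares on one machine, step size lr/sqrt(t)."""
    n, p = X.shape
    theta = np.zeros(p)
    t = 0
    for _ in range(n_epochs):
        for i in np.random.permutation(n):
            t += 1
            grad = (X[i] @ theta - y[i]) * X[i]  # gradient of (1/2)(x_i'theta - y_i)^2
            theta -= lr / np.sqrt(t) * grad
    return theta

def divide_and_conquer_sgd(X, y, K):
    """Run SGD independently on K equal shards, then average the estimates."""
    estimates = [sgd_least_squares(Xk, yk)
                 for Xk, yk in zip(np.array_split(X, K), np.array_split(y, K))]
    return np.mean(estimates, axis=0)

# Toy run with simulated data (illustrative only).
rng = np.random.default_rng(0)
n, p, K = 20_000, 50, 8
X = rng.standard_normal((n, p))
theta_star = rng.standard_normal(p) / np.sqrt(p)
y = X @ theta_star + rng.standard_normal(n)
print(np.linalg.norm(divide_and_conquer_sgd(X, y, K) - theta_star))
```

In this toy run $\sqrt{n/p} = 20$, so $K = 8$ stays within the abstract's condition $K=O(\sqrt{n/p})$; taking $K$ much larger would make each local estimate too noisy for simple averaging to retain the optimal rate.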

2. We propose a novel transformation of lack-of-fit tests for parametric quantile regression models into the problem of checking the equality of two conditional distributions of the covariates. We can then borrow successful test statistics from the rich literature on two-sample problems, which gives us great flexibility in constructing a suitable lack-of-fit test based on our knowledge of the covariates. This finding is first demonstrated for low-dimensional data using a practical two-sample test, which has sound power for random vectors of moderate dimension. We then turn to high-dimensional data and construct a lack-of-fit test for linear quantile regression models by combining two-sample test statistics from the literature.

The asymptotic distribution of the test statistic under the null hypothesis has an explicit form, so critical values or $p$-values can be calculated directly.

The usefulness of these tests is illustrated by simulation experiments, and a real-data analysis gives further support.
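
The reduction works because, under the null model, $P(Y \le q_\tau(X) \mid X) = \tau$ for every $X$, so the covariate vectors falling below and above the fitted $\tau$-th quantile surface are both draws from the marginal distribution of $X$, and misspecification shows up as a difference between the two groups. The sketch below illustrates this reduction with an energy-distance permutation test; the helper names, the choice of energy distance as the two-sample statistic, and the permutation calibration are our assumptions, whereas the talk's test combines other two-sample statistics and uses an explicit asymptotic null distribution instead.

```python
import numpy as np
from scipy.spatial.distance import cdist
import statsmodels.api as sm

def energy_stat(A, B):
    """Multivariate energy-distance statistic between samples A and B."""
    return 2 * cdist(A, B).mean() - cdist(A, A).mean() - cdist(B, B).mean()

def quantile_lack_of_fit_pvalue(X, y, tau=0.5, n_perm=199, seed=0):
    """Fit a linear tau-quantile model, split the covariates by the sign of
    the quantile residual, and compare the two groups with a permutation
    energy test. A small p-value indicates lack of fit."""
    design = sm.add_constant(X)
    fitted = sm.QuantReg(y, design).fit(q=tau).predict(design)
    below = y <= fitted                  # under H0, P(below | X) = tau for every X
    obs = energy_stat(X[below], X[~below])
    rng = np.random.default_rng(seed)
    exceed = sum(
        energy_stat(X[lab], X[~lab]) >= obs
        for lab in (rng.permutation(below) for _ in range(n_perm))
    )
    return (exceed + 1) / (n_perm + 1)
```

Any valid two-sample statistic could replace the energy distance here; this flexibility in choosing the statistic to suit the covariates is precisely the point of the abstract.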

All faculty and students are welcome to attend!


华南统计科学研究中心 (South China Center for Statistical Science)

2018/9/11