Title: Optimal regression design under second-order least squares estimator: theory, algorithm and applications
Author: Yeh, Chi-Kuang
Date: 2018-07-23
URI: http://hdl.handle.net/1828/9765
Type: Thesis
Language: en
Rights: Available to the World Wide Web
Subjects: Optimal design; Statistics; Second-order least squares estimator; Convex optimization; Generalized scale invariance; Number of support points

Abstract: In this thesis, we first review the current development of optimal regression designs under the second-order least squares estimator in the literature. The criteria include A- and D-optimality. We then introduce a new formulation of the A-optimality criterion so that the results can be extended to c-optimality, which has not been studied before. Following Kiefer's equivalence results, we derive the optimality conditions for A-, c- and D-optimal designs under the second-order least squares estimator. In addition, we study the number of support points for various regression models, including Peleg models, trigonometric models, and regular and fractional polynomial models. A generalized scale invariance property for D-optimal designs is also explored. Furthermore, we discuss a computational algorithm for finding optimal designs numerically. Several interesting applications are presented, and the related MATLAB code is provided in the thesis.