Simulation Experiments in Practice: Statistical Design and Regression Analysis

Research output: Working paper (Discussion paper)


Abstract

In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic theory assumes a single simulation response that is normally and independently distributed with a constant variance; moreover, the regression (meta)model of the simulation model’s I/O behaviour is assumed to have residuals with zero means. This article addresses the following questions: (i) How realistic are these assumptions, in practice? (ii) How can these assumptions be tested? (iii) If assumptions are violated, can the simulation's I/O data be transformed such that the assumptions do hold? (iv) If not, which alternative statistical methods can then be applied?
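To illustrate the contrast the abstract draws between one-factor-at-a-time experimentation and DOE with regression, the sketch below runs a hypothetical simulation model over a 2^2 full factorial design and fits a first-order linear regression metamodel by ordinary least squares. The simulation function, its coefficients, and the noise level are all invented for illustration; they are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(x1, x2):
    # Hypothetical simulation model: linear response plus Gaussian noise.
    return 10 + 3 * x1 - 2 * x2 + rng.normal(scale=0.5)

# 2^2 full factorial design in coded levels (-1, +1), with replication.
design = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
reps = 5
X, y = [], []
for x1, x2 in design:
    for _ in range(reps):
        X.append([1.0, x1, x2])  # intercept plus two main effects
        y.append(simulate(x1, x2))
X, y = np.array(X), np.array(y)

# OLS estimate of the first-order metamodel y = b0 + b1*x1 + b2*x2.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # approximately [10, 3, -2]
```

Because the factorial design is orthogonal in the coded factors, the main effects are estimated independently with minimal variance; a one-factor-at-a-time scheme with the same number of runs would estimate each effect less precisely.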
Original language: English
Place of publication: Tilburg
Publisher: Operations research
Number of pages: 22
Volume: 2007-09
Publication status: Published - 2007

Publication series

Name: CentER Discussion Paper
Volume: 2007-09


Keywords

  • metamodels
  • experimental designs
  • generalized least squares
  • multivariate analysis
  • normality
  • jackknife
  • bootstrap
  • heteroscedasticity
  • common random numbers
  • validation

Cite this

Kleijnen, J. P. C. (2007). Simulation Experiments in Practice: Statistical Design and Regression Analysis. (CentER Discussion Paper; Vol. 2007-09). Tilburg: Operations research.