AIAA-2000-4803

MODELING AND SIMULATION UNCERTAINTY IN MULTIDISCIPLINARY

DESIGN OPTIMIZATION

Stephen M. Batill†, John E. Renaud‡, and Xiaoyu Gu††

Department of Aerospace and Mechanical Engineering

University of Notre Dame, Notre Dame, Indiana

ABSTRACT

This paper is intended to contribute to the ongoing discussion of selected concepts related to the topic of technical risk or uncertainty in the model-based design of physical artifacts. The paper focuses on the use of analytic models and numerical simulation in the multidisciplinary design optimization process. It considers how issues of physical process variability, information uncertainty and the use of models and simulations influence the design decision process. This paper only qualitatively addresses these issues but the goal is to provide a focus for discussion of concepts associated with information uncertainty as applied to model-based multidisciplinary design and optimization.

INTRODUCTION

Consider first some issues and concepts related to the physical system or artifact being designed. The artifact can be defined, using the concepts introduced by Herbert Simon,1 in terms of its inner and outer environment: "An artifact can be thought of as a meeting point – an 'interface' in today's terms – between an 'inner' environment, the substance and organization of the artifact itself, and an 'outer' environment, the surroundings in which it operates." The engineer is tasked with establishing a complete definition of the design and, in many cases, manufacturing details (i.e. the inner environment) that can cope with the outer environment in order to achieve a predetermined set of goals. Many of the issues that are often generically referred to as "uncertainty" are related to the ability of the artifact to achieve those goals and are due to characteristics associated with both the inner and outer environments. The process of design is associated with the engineer making decisions based upon information – from a variety of sources – related to this interface.

† Associate Dean and Professor, Associate Fellow AIAA
‡ Associate Professor, Associate Fellow AIAA
†† Graduate Research Assistant, Student Member AIAA
Copyright © 2000 by S. Batill. Published by the American Institute of Aeronautics and Astronautics, Inc. with permission.

The following paper addresses a number of issues associated with uncertainty as related to physical artifacts. The basic distinction is made between the artifact and sources of uncertainty as viewed in the "real world" versus the role of uncertainty in analysis, modeling and simulation. Both these issues influence the decisions made during the system design process. Many of the issues addressed below have resulted from a variety of research efforts related to the development and application of methods for multidisciplinary design optimization. Some of the concepts presented below have also been adapted from the considerable body of research related to experimental uncertainty.2-5 The recent work by Oberkampf et al.6 has also served to provide some important additional concepts and terminology. Though Reference 6 addresses the estimation of "total uncertainty" in modeling and simulation, and that is just one of a number of issues related to the role of uncertainty in multidisciplinary design optimization, the careful and detailed framework it provides is an important contribution to this area. The current paper does not provide a review of literature in this growing field but does include a selected Bibliography that may assist those interested in pursuing related efforts.

An important issue associated with the subject topic of this paper is the confusion and ambiguity associated with many of the terms used herein. Unfortunately, there is a set of terms that can assume both general "secular" definitions and more specific "technical" definitions. These include uncertainty, error, variability, bias, precision, and tolerance. In this paper we hope that the particular context will help the reader to differentiate between the secular and technical definitions. In many cases it was found that the terminology proposed by Oberkampf et al.6 was very effective in expressing the issues of concern and was adopted, or adapted, whenever possible.

UNCERTAINTY IN THE REAL WORLD

In order to describe how "uncertainty" (and all those concepts associated with it, e.g. risk, robustness, quality, …) influences the engineering process of system design, consider the variability that exists in physical artifacts and in the way in which the artifacts interact with their outer environments. In order to study the perceived, and real, variations that occur in physical artifacts, one needs to be careful how these observations are made, described and interpreted.

First, any measurable characteristic of a physical artifact is always subject to the uncertainty in the measurement process. As a simple example in this discussion, consider a particular model/make of an automobile and a small set of characteristic "states" of the automobile that may be of interest to a designer (and a customer). One characteristic is weight and the other is time-to-accelerate to 60 mph. Both of these characteristics can be measured for a single vehicle on repeated trials or for a group of vehicles using either single measurements or repeated trials. The differences in the measured values of these two characteristics would depend upon a variety of factors.

a) Variations in the measurement process due to differences in technique, equipment or other sources. These would occur even if the physical artifact(s) involved were absolutely "identical" and the measurement events (i.e. the acceleration test) were performed in the exact same way. This measurement uncertainty is a topic of study in itself and, for the remainder of this discussion, it will be assumed that this source of uncertainty has been eliminated.

b) Variations in the "outer" environment during the measurement. These may be due to differences in wind direction or road conditions during the acceleration test. One would typically like to eliminate these as part of the measurement process but they will be present in all cases. Due to changes that occur in time, no two "measurements" on the same or similar artifacts are carried out under the exact same conditions.

c) As in case b), where the outer environment will continuously change, there will occur continuous change in the inner environment. The same artifact is not the "same" for two different measurements. One might believe that the vehicle weight would not change significantly (measurably) from one weight measurement to the next but in reality it would, albeit very slightly (e.g. consumption and evaporation of fuel, accumulation of residue). If a variety of acceleration tests are conducted without changing tires, the effect on acceleration times could be much more pronounced.

d) Variations associated with tests conducted on different realizations of the same product. These represent variations in the "inner environment" between different realizations of nominally the same artifact. No two cars are created exactly the same and this is one of the factors of significant interest in the design process.

From the perspective of the designer, item d) above is often considered to be the issue of primary concern. What are the primary sources of these variations in the inner environment? Is it possible to reduce the sources of variation in the artifact's performance that are associated with the "inner environment" through decisions made in the design process? The designer is also often concerned with how to reduce the sensitivity of the system's behavior or performance to variations in the outer environment. Is it possible to design the system so that it is less sensitive to the outer environment variability?

Consider if a particular characteristic of an artifact is measured using a sample of 32 different realizations of the "same" artifact. Figure 1, shown below, might be representative of the results of such a series of "measurements". As shown in the figure, not all of the results were the same. These results can be described using a variety of statistical techniques and one usually attempts to present this stochastic information in terms of a mean or nominal value and some measure of the variation. The mean and the variation are "characteristics" of the artifact associated with both its inner environment and its outer environment.

It is both the mean and the variability that the designer would like to be able to predict, and/or control, as part of the design process. As will be emphasized below, the information available in this figure is only available after the artifact has been realized and, in most cases, that is "too late" to influence the design process. Developing techniques to produce accurate a priori estimates of these characteristics is an important goal in model-based design.
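The mean and variation of such a sample can be summarized with elementary statistics. The sketch below is purely illustrative: the 32 values are drawn from an assumed normal population standing in for a measured characteristic (e.g. time-to-accelerate to 60 mph); in practice these numbers would come from the measurements themselves.

```python
import random
import statistics

random.seed(1)  # reproducible illustration

# Hypothetical stand-in for 32 measurements of a characteristic state
# (e.g. time-to-accelerate to 60 mph, in seconds); real data would come
# from tests on 32 realizations of the artifact.
samples = [random.gauss(14.5, 1.2) for _ in range(32)]

mean = statistics.mean(samples)    # nominal value of the characteristic
stdev = statistics.stdev(samples)  # one measure of the variation

print(f"n = {len(samples)}, mean = {mean:.2f} s, std dev = {stdev:.2f} s")
```

Note that the mean and standard deviation computed this way are themselves only estimates of the population characteristics, with their own sampling uncertainty.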

[Figure 1: histogram of frequency of occurrence versus characteristic state (approximately 10–19) for the sample of 32 realizations.]

Figure 1. Sample population of a characteristic state for a "real" system

UNCERTAINTY FROM THE SYSTEMS DESIGNER'S PERSPECTIVE

From the designer's perspective the artifact exists only as an abstraction. The description of the artifact is always incomplete and any information related to the artifact's characteristics or behavior is approximate prior to its physical realization. Dealing with this incomplete description of the artifact and the approximate nature of the information associated with its characteristics and behavior are key issues in the design process. The following discussion does not address uncertainty in the outer environment and its influence on the designer. This too is a critical issue and worthy of considerable attention.

It is difficult to effectively generalize the design process due to the great diversity in the manner in which the process is carried out and the resulting artifacts. There are differences between the evolutionary design of a simple component and the revolutionary design of a complete complex system. The current discussion, though potentially applicable over this wide range, is intended to focus on the emerging process of model-based design using the extensive capabilities now available with modeling, simulation and digital computing. The discussion will focus on how engineers develop and interpret the information that is used to guide and validate decisions made during the design process.

In the following one should assume that the basic concept for an artifact has been established. With the selection of this basic concept the engineer also has to identify (though not necessarily quantify) a finite set of "design variables" that will eventually be used to uniquely specify the design. (Thus the current focus is only one small step in the overall system design process.) This set of design variables will evolve with the design throughout the design process. The identification and quantification of this set of design variables are central to this process. It is this quantitative description of the artifact, based upon information developed using numerical models or simulation, that is the focus of this discussion. Though there is great interest in providing quantitative descriptions as early in the design process as possible, this depends upon the availability and the level of development of models and analysis methods related to the class of artifacts being considered.

As the level of abstraction of the artifact changes and more and more detail is required to define it, the number of design variables will grow considerably. Design variables typically are associated with the type of material used in the artifact and the geometric description of the form that the material assumes as part of the artifact. The engineer can select materials that possess desirable qualities and then attempt to "shape" them into useful forms to achieve a given purpose. Eventually the engineer will be required to specify (i.e. quantify) each of the design variables in the most refined representation of the artifact. This description often takes the form of a "detailed engineering drawing" that includes materials information and all necessary geometric information needed for fabrication, including manufacturing tolerances.

Decisions associated with quantifying (or selecting) the design variables are usually based upon an assessment of a set of behavioral variables, also referred to herein as system states. The behavioral variables or system states are used to describe the artifact's characteristics. The list of these characteristics also increases in detail as the level of abstraction of the artifact decreases. The designer uses the behavioral variables to assess the suitability of the artifact and these variables represent the engineer's attempt to predict the future. They are based upon information about the artifact gained from a variety of sources.

There are two primary sources of information available to engineers during the design process - both of which are used in most cases. These two sources are: 1) archived experience, and 2) engineering analysis, modeling and simulation.

Engineers often gather experiential information from empirical data or knowledge bases. Interpolating or extrapolating from information on similar design concepts can also help provide the designer with the confidence to make a decision based upon the success of earlier, similar designs. Often this type of information is incorporated into rules-of-thumb, design handbooks or design guidelines. If an engineer wishes

to include a bolted connection in a mechanical component, rarely would she design the bolt from scratch. The bolts would be selected from a finite set of those available and the selection would be based upon the characteristics of the bolts as provided by the "bolt manufacturer". It would be the designer's role to estimate the loads, temperatures and other conditions that would be imposed on the bolts and then to select the bolts that would meet those requirements. The selection decision would most likely allow for certain factors of safety or other means to include the information uncertainty in the process. The use of empirical information requires the designer to make numerous assumptions concerning the suitability of the available information and its applicability to the current situation. These assumptions add to the uncertainty in the design process.

There are also many decisions made in the design process that are based upon individual or corporate experience that is not formally archived in a database. This type of information is very valuable in the design of artifacts that are perturbations (evolutionary designs) of existing successful designs but has severe limitations when considering the design of new or revolutionary designs. Many organizations have archived, or are currently trying to archive, this type of experience in a formal manner. Though this may be very useful information, quantifying the uncertainty associated with it, in a way that will assist in quantifying the risk associated with the entire product, is usually not possible.

The second type of information available to the designer is that based upon analysis, modeling and simulation. As engineering systems become more complex and greater demands are placed upon artifact performance, time-to-market and cost, this source of information will become even more important in the design process. With the explosive growth in digital computing, and in particular with the development of many model-based tools such as FEA for stress, deformation and thermal environment prediction and CFD for fluid-structure interactions, the opportunity to perform complex numerical simulations using desktop computers is a possibility for many engineering designers. It should be emphasized that the information provided by these models or simulations carries with it some level of uncertainty, and the use of that information introduces a level of risk to the decisions based upon it. Quantifying that uncertainty and understanding the role it plays in the product design process is an issue attracting much discussion.

MULTIDISCIPLINARY SYSTEM MODELING AND ANALYSIS

To introduce the idea of model-based design for a multidisciplinary system (and there are very few systems that are not inherently multidisciplinary), the following discussion will use a number of concepts adapted from terminology used in multidisciplinary design optimization, MDO. In this context the process used to develop the quantitative description of the artifact using numerical modeling and simulation is referred to as a System Analysis, SA. The SA for a coupled, non-hierarchic system is illustrated schematically below using an N×N or N² diagram.

[Figure 2: N² diagram of three Contributing Analyses, CA1–CA3, with design variables {x} as input and system states {y} as output.]

Figure 2. Coupled System Analysis, SA

The representation in Figure 2 assumes that a number of different analytic or numerical models are used to represent the artifact and its behavior at a given level of abstraction. These individual models or analyses are referred to as Contributing Analyses, CAs. At a given level of abstraction the artifact is defined using a vector of design variables, {x}. The behavioral characteristics, or system states, of the artifact at the corresponding level of abstraction are expressed in a vector of states, {y}. The purpose of the SA is to provide a means of computing {y} for a given vector {x}.

Each individual CA typically requires a subset {xi} of the total system design variable vector, {x}, and is used to compute a number of the elements {yi} of the vector of system states {y}. In the case of the completely coupled system shown in the diagram above, each of the CAs is a source of information that is needed in another one of the CAs, as indicated by the arrows. The arrows illustrate that information is both fed forward and fed back during the execution of the SA of a complex, multidisciplinary system. This requires that the process be iterative in nature. The formulation of the system analysis is quite complex and the system analysis itself evolves along with the artifact during the design process.
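The feed-forward/feed-back structure described above can be sketched as a fixed-point iteration over the CAs. Everything in the sketch below is hypothetical: the three one-line functions stand in for real disciplinary analyses, and the loop simply repeats until the state vector {y} is self-consistent.

```python
# A minimal sketch of the coupled System Analysis (SA) of Figure 2, with three
# hypothetical Contributing Analyses (CAs). Each CA maps its share of the
# design vector {x} and states fed to it by the other CAs into its own state;
# the coupled solution is found by iterating until {y} is consistent.

def ca1(x, y2, y3):  # hypothetical contributing analysis 1
    return 0.5 * x[0] + 0.1 * y2 - 0.05 * y3

def ca2(x, y1):      # hypothetical contributing analysis 2
    return x[1] + 0.2 * y1

def ca3(x, y1, y2):  # hypothetical contributing analysis 3
    return 0.3 * y1 + 0.1 * y2 - x[0]

def system_analysis(x, tol=1e-10, max_iter=100):
    """Iterate the coupled CAs to a consistent state vector {y} for a given {x}."""
    y1 = y2 = y3 = 0.0
    for _ in range(max_iter):
        y1_new = ca1(x, y2, y3)
        y2_new = ca2(x, y1_new)
        y3_new = ca3(x, y1_new, y2_new)
        if max(abs(y1_new - y1), abs(y2_new - y2), abs(y3_new - y3)) < tol:
            return y1_new, y2_new, y3_new
        y1, y2, y3 = y1_new, y2_new, y3_new
    raise RuntimeError("SA did not converge")

y = system_analysis([2.0, 1.0])
print(y)
```

The weak coupling coefficients make this toy iteration a contraction, so it converges quickly; real coupled SAs offer no such guarantee, which is one reason SA convergence is raised as an issue later in the paper.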

Early in the process, when the representation of the artifact is very abstract, the individual CAs may be simple "textbook" analyses but, as the process continues, the individual CAs can become much more detailed and time-consuming, such as the FEA or CFD analyses suggested above. Thus the time necessary to perform the SA depends upon the detail of the models, the complexity of the analyses and the convergence characteristics of the iterative process associated with the SA.

Reference 6 presents a much "finer grain" discussion of issues related to either an individual CA or the complete SA. It proposes that the development of a SA or CA is composed of a number of distinct phases; these are:

1) conceptual modeling of the physical system,
2) mathematical modeling of the conceptual model,
3) discretization and algorithm selection for the mathematical model,
4) computer programming of the discrete model,
5) numerical solution of the computer program model,
6) representation of the numerical solution.

Each of these phases introduces issues of variation, uncertainty and error (special terms discussed in some detail later) into the process. The final statement in Reference 6 stresses the complexity and importance of these issues:

"…this formal recognition of the sources of nondeterminism and error shows the compounding effect and rapid growth of each source through the modeling and simulation process. Some have referred to this growth of total uncertainty through the modeling and simulation process as 'overwhelming.' However, it is an issue that must be faced by analysts and decision makers who use the results of modeling and simulation."

It may be impractical to pursue the detailed process suggested in Reference 6 for all model-based design activities, but it is critical to realize that the issues introduced therein must be addressed in many cases and their importance will continue to increase.

Returning to the simple representation of the system analysis discussed earlier, the schematic representation shown in Figure 2 actually lacks an important feature that has been added to Figure 3 below.

[Figure 3: the N² diagram of Figure 2 augmented with a parameter vector {p} as an additional input to the Contributing Analyses.]

Figure 3. System Analysis augmented with input parameters

As illustrated, there is additional information that is typically required to perform the system analysis. This information is represented in the parameter vector, {p}. The elements of the parameter vector generally represent two types of information. The first type is information related to the outer environment in which the artifact will exist and/or operate. This is often outside of the direct "control" of the designer but is information necessary to perform the analyses contained within the SA. The second type is information about the artifact that is not yet available, due to the level of abstraction or model detail represented in the SA, but is required in order to perform some calculation within the SA. An example is the weight of a particular "out-sourced" component that will be selected at a later point in the design process but is needed to predict vibration response. Thus some information must be assumed related to certain features of the inner environment at different stages in the design process, and this can be included in {p}.

Given a specific design as represented by the design vector, {x}, and a set of parameters, {p}, one can perform the system analysis in order to develop the corresponding vector {y}. This process involves making calculations within individual CAs and exchanging information between CAs until convergence to a consistent solution is achieved. The resulting vector {y} will completely describe the artifact, at least at the level of detail provided by all the CAs included in the SA. It should be noted that in some situations the SA may also include certain "design rules" and thus the distinction between analysis and design can become somewhat clouded. The issue of convergence of the SA so that a given {x} and {p} will yield a unique {y} is an issue for further discussion, but the existence of a unique solution to the SA is assumed at this point.

At least for the current discussion one should assume that the SA is primarily intended to provide information that will be used by the engineer to make design decisions. These decisions usually involve altering the design vector {x} in order to meet certain constraints on the selected system states. This typically means achieving some level of desired "performance" as well as avoiding certain types of failure. How this is accomplished is up to the designer(s) and, depending upon the complexity of the system, this can be a very challenging task. The MDO discipline has as one of its goals the rational automation of this decision process in a manner such that not only are feasible designs achieved (i.e. ones that meet all requirements or constraints) but optimum designs are identified. Returning to the question of uncertainty, the designer is tasked with determining how well the SA represents the "real world" and assessing the risk associated with making decisions based upon the information provided by the SA. There are a number of important questions to be asked in this regard.

What is the uncertainty associated with the models and techniques used in the SA itself?

How can one predict the inevitable variability, or probabilistic behavior, that the artifact will demonstrate when actualized?

Can the designer predict and compensate for the stochastic nature of the artifact's behavior in the design of the artifact?

The remaining discussion will attempt to briefly address the first two questions. The third is an ongoing challenge for future research. Recall that it is both the mean characteristics and the variability of the actual artifact that the designer would like to be able to predict prior to "fabricating" the artifact. The designer would like to be able to accurately predict the nominal value of each of the states used to define the characteristics of the artifact as well as provide a quantitative characterization of the stochastic behavior of these characteristics.

The answer to the first question, "What is the uncertainty associated with the SA itself?", depends upon a number of factors, many of which have been detailed in Reference 6 from the perspective of the analyst. The following attempts to address this question from the perspective of the designer, as their perspective may not be exactly the same, in that the designer will eventually be required to make decisions based upon this information. Consider a situation in which the level of abstraction of the artifact is defined, thus the individual CAs are identified and the elements of {x}, {p} and {y} are also defined. By selecting specific values for all of the elements in {x} and {p} one can usually uniquely define a corresponding set of states {y} (neglecting systems that demonstrate random or chaotic behavior, for which state prediction takes on a different meaning). Normally, if this process is repeated with the same values for {x} and {p}, it would yield the same {y}.

Consider the "real world" situation discussed earlier. A system analysis could be repeated twenty times and this deterministic analysis procedure, using mean values of all uncertain design variables and parameters, would yield the same result in each case. If the result for the selected characteristic state is included on the histogram presented earlier, it would yield the result shown in Figure 4.

[Figure 4: the "real world" histogram of Figure 1 with the single, repeated SA result shown as a spike at one value of the characteristic state.]

Figure 4. Comparison of SA result to artifact response

In this case all the SAs provide the same result - as expected. If the system analysis contained CAs that included some "random" characteristics, such as curve fitting schemes that are initiated from random numbers or other numerical processes with inherent random characteristics, it is possible that the SA itself could yield different results for the same {x} and {p}. If that is the case, the characteristic state may not be the same for each SA performed and that would need to be considered as another source of SA uncertainty, but that situation is not considered here. The difference between the mean value of the characteristic state of the "real" system and the value resulting from the SA is referred to herein as a bias error.

Invoking the concepts introduced in Reference 6, bias error would be a combined effect of "uncertainty" and an "acknowledged error". Uncertainty was defined as resulting from a lack of knowledge or incomplete information. Uncertainty as defined in Reference 6 is normally not stochastic (in contrast to their concept of variability, which is inherently stochastic) and, for a specific application of an SA, it would most likely be constant. An acknowledged error is a "recognizable deficiency" that is "not due to a lack of knowledge" and


it is usually reproducible and deterministic. In the current context a bias error would be one recognized by the analyst (though not necessarily easily quantifiable) and introduced by the modeling or simulation process. The bias error should be differentiated from an unacknowledged error, such as a "mistake"; these are not considered herein, though they can be significant, particularly in the complex software often associated with design optimization.
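As a purely illustrative calculation (all numbers below are hypothetical), the bias error defined above is simply the difference between the realized mean of the characteristic and the single deterministic SA prediction, and it is knowable only after the artifact exists:

```python
# Illustrative only: the bias error is the difference between the mean of the
# realized ("real world") characteristic and the deterministic SA result.
# In practice the real-world mean is unknown until the artifact exists; the
# numbers below are assumed for illustration.

real_world_samples = [14.1, 14.8, 15.2, 14.5, 14.9, 15.0, 14.3, 14.7]  # assumed test data
sa_result = 13.6   # deterministic SA prediction at mean {x}, {p}

real_mean = sum(real_world_samples) / len(real_world_samples)
bias_error = real_mean - sa_result
print(f"bias error = {bias_error:+.2f}")  # positive here: the SA under-predicts the mean
```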

Bias errors result from the inherent limitations associated with the analytic models and numerical methods used in the SA. The magnitude and "direction" of this error are the cumulative result of many assumptions associated with the individual CAs within the SA. The magnitude of the bias error would most likely change if the SA were performed for another "design", in other words if an alternative design (i.e. different values of {x}) were considered. Since the assumptions and limitations in the analyses and models most likely depend upon characteristics of the design, the bias error would change (e.g. the applicability of the thin-walled assumption in a stress calculation or the value assumed for the thermal conductivity of a material). Though the existence of the bias error is recognized, its value is not known a priori and it can only be altered by modifying the model or the simulation process.

In the coupled system analysis it is important to realize that the effect of bias errors in individual CAs propagates through the entire SA. If CA2 requires information developed in CA1, bias errors in states computed in CA1 will influence the calculations in CA2. Thus if one changes a model or analysis procedure within one CA, it will influence the results of other CAs to which it provides state information and thus influence the overall result from the SA. The propagation of bias error through the SA is an issue of concern. It should be stressed that one never actually knows the bias error until the artifact has been realized and, unless a particular phenomenon is encountered in actual operation, the presence of a bias error may never be known. Thus predicting the bias error during the product design process requires the designer to make assumptions about the magnitude and "direction" of the bias errors for all of the state information that will influence the design. Bias errors can be influenced by changing the model (e.g. including more, or more appropriate, finite elements in a stress calculation) or by changing the method of analysis (e.g. using FEM analysis in lieu of a simple beam-bending formula). Considerable engineering judgement and experience can be required in order to quantify bias errors for a CA within a SA. One might assume that in the future there will be a need to provide bias error estimates as part of the system analysis.

The next question posed above, "How can one predict the inevitable 'random', or stochastic, behavior that the artifact will demonstrate when actualized?", has received considerable attention in the recent past. The most straightforward way to begin to understand the influence of the variability using the system analysis is illustrated schematically below using the same N² diagram used earlier.

[Figure 5: the N² diagram of Figure 3 with perturbed inputs {x + Δx} and {p + Δp} and the resulting perturbed output {y + Δy}.]

Figure 5. System analysis with input and output variability

In this case it is recognized that some or all of the design variables {x} or parameters {p} could have associated with them some variation or "variability" (Δ) from a mean value.6 Changing some or all of the elements in these vectors will result in changes in the resulting states {y}. If one were to perform a Monte Carlo simulation in which variations in the design variables and parameters are selected from an appropriately selected population of random numbers, the resulting state information could be represented on a histogram as illustrated below in Figure 6.
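A minimal sketch of this Monte Carlo procedure follows. The one-line system analysis, the input means and the assumed normal variability are all hypothetical stand-ins for a real coupled SA and real input distributions.

```python
import random
import statistics

random.seed(2)  # reproducible illustration

# Perturb the design variables {x} and parameters {p} about their means with
# assumed normal variability, rerun the (here, hypothetical one-line) system
# analysis, and summarize the resulting distribution of a characteristic state.

def system_analysis(x, p):          # stand-in for the full coupled SA
    return 3.0 * x[0] + 0.5 * x[1] * p[0]

x_mean, x_sigma = [2.0, 4.0], [0.05, 0.10]   # assumed input variability
p_mean, p_sigma = [1.5], [0.08]

ys = []
for _ in range(5000):
    x = [random.gauss(m, s) for m, s in zip(x_mean, x_sigma)]
    p = [random.gauss(m, s) for m, s in zip(p_mean, p_sigma)]
    ys.append(system_analysis(x, p))

print(f"state mean = {statistics.mean(ys):.2f}, std dev = {statistics.stdev(ys):.2f}")
```

A histogram of `ys` would correspond to one of the model distributions in Figure 6; the cost of the approach scales directly with the cost of one SA evaluation, which is the practical limitation noted at the end of this section.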

[Figure 6: histograms of the characteristic state for the "real world" sample and for Monte Carlo results from two models, Model A and Model B, with differing means and spreads.]

Figure 6. Characteristic state information resulting from multiple system analyses and two different models

In this case three "sets" of information have been

presented. The “real world” distribution is the same as

7

American Institute of Aeronautics and Astronautics

that presented earlier. The two other sets are used torepresent two different SAs that differ in the detail ortype of models used. The variations associated withModel A and Model B result from the use of randomvariations about the nominal mean value of designvariables and parameters and repeated system analyses.In the case of Model A there appears to be a negativebias error and for Model B a positive bias error. Neithermodel has been particularly effective in predictingeither the mean value or the variation in this particularcharacteristic or state. It must be emphasized that inreality the designer never knows the “real world”information before the design has been realized. Thusthe designer would be faced with selecting betweenModel A and Model B and the somewhat sparseinformation provided with each.

The variation about the mean value of the characteristic state for each model results from the perturbations about the mean values of the design variables used in the SA. It is an attempt on the part of the designer to quantify the "variability" that is "the inherent variation associated with the physical system or the environment under consideration.6" This variability is stochastic in nature and is often characterized by a probability or frequency distribution if adequate samples are available.

The magnitude and character of the variations in the state will depend upon the form of the assumed probability distribution for the design variables and parameters. In this coupled SA, random variations in a single design variable or parameter will result in random variations in many of the system states, as these variations can influence the states computed in a single CA, but those states are then used as input into other CAs. The propagation of random variations in the design variables, parameters and states through a SA is also an area of current research interest. It should be noted that the overall distribution in the predicted system behavior as defined by the SA includes both bias errors and random variability. Discriminating between the sources and influence of each is an important design consideration. Thus the "total uncertainty" associated with the information provided by the system analysis, and required by the designer, is a combination of numerous sources and is not a single "number."

As indicated above, one might consider trying to develop additional insight into the random variability (mean and variation for system states) via repeated SAs using a Monte Carlo approach (or, for the simplest of systems, analytic methods). This may be feasible if the computing time required to perform the SA is small and the number of design variables and parameters is also relatively small. As the design matures and the level of abstraction changes, the SA becomes more complex and includes models or analyses not well suited to this approach. Quantifying the total uncertainty will require new approaches to provide the designer with the information on both the expected mean value and variability. Of particular interest will be approaches that do not require the extensive numbers of "samples" usually needed for conventional statistical methods of random data analysis.
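One family of low-sample approaches is first-order (Taylor series) moment propagation, which estimates the mean and standard deviation of a state from the input statistics and finite-difference sensitivities, using only n+1 system analyses for n inputs instead of thousands of Monte Carlo samples. A minimal sketch, with an assumed state equation:

```python
def first_order_moments(f, means, sds, h=1e-6):
    """First-order Taylor estimate of the mean and standard deviation of
    f(v1, ..., vn) given independent inputs with known means and sds.
    Needs one nominal evaluation plus one perturbed evaluation per input."""
    f0 = f(*means)
    var = 0.0
    for i, (m, s) in enumerate(zip(means, sds)):
        bumped = list(means)
        bumped[i] = m + h
        dfdv = (f(*bumped) - f0) / h    # sensitivity of the state to input i
        var += (dfdv * s) ** 2          # independent inputs: variances add
    return f0, var ** 0.5

# illustrative state equation (not from the paper): y = x**2 + 3*p
mean, sd = first_order_moments(lambda x, p: x**2 + 3*p,
                               means=(10.0, 2.0), sds=(0.5, 0.1))
print(mean, sd)   # 106.0 and about 10.0045
```

This recovers only the first two moments and assumes independent inputs and near-linear behavior over the input spread; it says nothing about distribution shape or bias errors, which is part of why new approaches remain a research topic.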

In order to develop the information necessary to predict a system's total uncertainty it appears as if the designer must at least:

1. Given the level of abstraction of the artifact at a given point in the design process, select analysis methods and appropriate fidelity for the models.

2. Define the design variables {x}, system parameters {p} and system states {y} necessary for the system analysis. This requires establishing the inner and outer environments for the system.

3. Estimate, based upon other sources, the bias errors associated with the individual CAs that make up the SA. The variation in the bias errors across the design space must be considered. Those elements of the bias errors associated with lack of knowledge or incomplete information may be the most problematic.

4. Estimate the statistical characteristics (i.e. variance, probability distribution, etc.) of the elements of the design variable vector {x} and the system parameters {p}.

Once these steps have been performed, one can then consider how to estimate the bias errors and the variability associated with the system states. The total uncertainty in the predicted characteristics of the artifact (i.e. the information that will subsequently be used to make design decisions) will then be a combination of this information. It must be emphasized that, unlike the simple examples given above, the "real world" results, actual system realizations, are never known during the design process. Thus it is only the nominal value of the system states, as predicted by the SA, and the quantified estimate of the total uncertainty that the designer will have to estimate the expected range of behavior and the probability for that behavior to occur. Since the total uncertainty information provides this range, it will be an important factor in helping to quantify risk and influence the design.
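As one deliberately simplified convention for the final combination step, classical experimental uncertainty analysis (e.g. the Coleman and Steele text cited in the references) combines a bias limit with a precision limit by root-sum-square; the numbers below are illustrative:

```python
def total_uncertainty(bias_limit, state_sd, coverage=2.0):
    """Root-sum-square combination of an estimated bias (systematic) limit
    with a precision limit taken as coverage * standard deviation of the
    predicted state. A simple convention only: the paper's point is that
    total uncertainty combines several such sources, not a single number."""
    precision_limit = coverage * state_sd
    return (bias_limit ** 2 + precision_limit ** 2) ** 0.5

# illustrative: a bias limit from step 3 and a state spread from step 4
print(round(total_uncertainty(bias_limit=0.8, state_sd=0.62), 3))
```

In practice the bias limit itself is uncertain (it comes "from other sources"), and its variation across the design space means this combination must be reassessed as the design moves.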


ONGOING CHALLENGES FOR MDO

The final consideration in this discussion is related to how this total uncertainty information will eventually influence the design decision process. As with the preceding discussion, there is no attempt to offer solutions, but the goal is to highlight issues associated with applying these ideas to multidisciplinary design optimization. Consider the simple "design problem" illustrated graphically below in Figure 7. In this case there are only two elements of the design variable vector {x} and no system parameters. There are three elements of the state vector {y} that are represented on the plot. Two of the elements of the state vector represent performance constraints and they are the dashed lines. The third state represents the measure-of-merit for the design. Lines of iso-merit are plotted and the "optimum" design occurs near the upper-right-hand corner of the figure. These lines would represent "deterministic" representations of design requirements and objectives as provided by the SA. Assume that in order to satisfy the design constraints, the values of the design variables must be "below" the two constraint lines. Visual inspection of this two-dimensional design space indicates that the "optimum" design occurs with a design vector {7,6} and is labeled with an "o" on Figure 7.

[Figure 7 plots Design Variable 2 against Design Variable 1 (each 0 to 10), showing the dashed performance constraints, the iso-merit lines and the "optimum" design marked "o".]

Figure 7. A simple design example

For this design both state constraints are satisfied (i.e. the design is feasible) and the design is as close to the optimum as any other feasible design. But what happens if, when the designer stipulates a design with design variable attributes {7,6}, in reality she will only be able to achieve designs with attributes {7+δ1, 6+δ2}, where δ1 and δ2 are random variables representative of those introduced in the manufacturing process, material variability or other sources of variability? Such a case is illustrated using the "+" symbols on Figure 8.

[Figure 8 repeats the design space of Figure 7 with the realized designs scattered about {7,6} marked "+".]

Figure 8. Effect of design variable "variability"

For this set of 100 "designs" many of the resulting designs would not be feasible, that is they would violate the constraints (assuming for the moment that the states representing the constraints remain deterministic). If the designer were instead to select the design represented by the nominal values {7.5,5.2} and encountered similar random variation in design variable nominal values (these designs are marked by "." in Figure 9), a more robust situation would exist. Figure 9 illustrates that almost all of these designs would be feasible, a more desirable condition.

[Figure 9 repeats the design space with the realized designs scattered about {7.5,5.2} marked ".".]

Figure 9. Robust design with variability

This simple example does not include the influence of variability, uncertainty and bias errors in the predicted states. The existence of these sources of total uncertainty would actually alter the location of the merit and constraint boundaries in the design space. The design space as shown represents those boundaries as predicted by the SA, which may not agree with "the real world" as a result of total uncertainty in the modeling and simulation provided by the SA. As


stressed above, the system realizations cannot be known a priori, and random numbers are just that, random! This brief example is intended to illustrate just a few of the issues related to systems design optimization with uncertainty. The ongoing challenge is to deal efficiently and economically with the uncertainty in a manner that recognizes the influence that it will have in the design process.
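The back-off from the deterministic optimum in Figures 8 and 9 can be sketched numerically. The constraint equations below are hypothetical linear boundaries chosen so the deterministic optimum sits near {7,6} (the paper does not give the actual constraint functions), and the scatter magnitude is likewise assumed:

```python
import random

def feasible(x1, x2):
    """Hypothetical stand-ins for the two dashed performance constraints;
    both are active at the assumed deterministic optimum {7, 6}."""
    return x1 + x2 <= 13.1 and 0.8 * x1 + x2 <= 11.6

def feasibility_rate(nominal, sd=0.12, n=2000, seed=3):
    """Fraction of realized designs satisfying both constraints when the
    achieved design variables scatter about the stipulated values."""
    rng = random.Random(seed)
    ok = sum(feasible(nominal[0] + rng.gauss(0, sd),
                      nominal[1] + rng.gauss(0, sd))
             for _ in range(n))
    return ok / n

print(feasibility_rate((7.0, 6.0)))    # the deterministic optimum: the '+' cloud
print(feasibility_rate((7.5, 5.2)))    # the backed-off nominal: the '.' cloud
```

With these assumed numbers roughly half of the realizations at {7,6} violate a constraint, while nearly all of those at {7.5,5.2} remain feasible, matching the qualitative picture in the figures.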

FINAL THOUGHTS AND RECOMMENDATIONS

One of the key challenges in achieving widespread acceptance of the use of MDO techniques in engineering design practice will be the effective integration of issues related to technical risk and uncertainty in the model-based design of physical artifacts. The authors would like to stress three areas in which continued progress must be made in order to achieve this end. The first is the continued development of consistent terminology for discussing these issues. Developing basic concepts and associated terminology will allow practitioners to effectively discuss and refine ideas. The second is fostering the recognition that modeling and simulation information used in the design process must have associated with it "best" estimates of uncertainty, and these must be quantified. Modeling and simulation software that does not provide quantified estimates of uncertainty will be of limited value. It is often felt that "some number is better than nothing," but "some number" without some measure of the risk associated with basing a decision on that number may not be better than "nothing." Lastly, techniques and design frameworks developed for MDO applications must integrate into the automated, rational design decision-making process the means to efficiently use quantified uncertainty information. The designer must eventually be able to express in a quantitative fashion the technical risk associated with a particular design, as the overall concept of "optimum" must include risk. Continued research emphasis and progress in each of these areas will be needed in order to achieve the goal of widespread use of MDO in engineering practice.

ACKNOWLEDGEMENTS

The authors wish to acknowledge the contributions of Drs. Raymond Brach and Amarjit Budhiraja of the University of Notre Dame, as well as Dr. William Oberkampf, Sandia National Labs, and Dr. David Olsen, Honeywell Aircraft Landing Systems. This effort was supported in part by the National Science Foundation under grant DMI98-12857.

REFERENCES

1. Simon, H.A., The Sciences of the Artificial, MIT Press, Cambridge, MA, 1981.

2. Bevington, P.R., and Robinson, D.K., Data Reduction and Error Analysis for the Physical Sciences, 2nd ed., McGraw-Hill, New York, 1992.

3. Coleman, H.W., and Steele, W.G., Jr., Experimentation and Uncertainty Analysis for Engineers, John Wiley & Sons, New York, 19.

4. Kline, S.J., and McClintock, F.A., "Describing Uncertainties in Single-Sample Experiments," Mechanical Engineering, Vol. 75, No. 1, pp. 3-9, January 1953.

5. Batill, S.M., "Experimental Uncertainty and Drag Measurements in the National Transonic Facility," NASA Contractor Report 4600, 1994.

6. Oberkampf, W.L., DeLand, S.M., Rutherford, B.M., Diegert, K.V., and Alvin, K.F., "Estimation of Total Uncertainty in Modeling and Simulation," Sandia Report SAND2000-0824, Sandia National Laboratories, Albuquerque, New Mexico, April 2000.

BIBLIOGRAPHY

1. Antonsson, E.K., and Otto, K.N., "Imprecision in Engineering Design," ASME Journal of Mechanical Design, Vol. 117, June 1995, pp. 25-32.

2. Ayyub, B.M., and Chao, R.U., "Uncertainty Modeling in Civil Engineering with Structural and Reliability Applications," Uncertainty Modeling and Analysis in Civil Engineering, B.M. Ayyub Ed., CRC Press, pp. 1-8, 1997.

3. Bier, V.M., "Fuzzy Set Theory, Probability Theory, and Truth Functionality," in Analysis and Management of Uncertainty: Theory and Applications, B.M. Ayyub, M.M. Gupta, and L.N. Kanal Eds., North-Holland, New York, 1992, pp. 65-78.

4. Box, G.E.P., and Draper, N.R., Empirical Model-Building and Response Surfaces, John Wiley & Sons, New York, 1987.

5. Brown, S.A., and Sepulveda, A.E., "Approximation of System Reliability Using a Shooting Monte Carlo Approach," AIAA Journal, Vol. 35, No. 6, pp. 10-1071, 1997.

6. Chen, W., Allen, J.K., Mistree, F., and Tsui, K.-L., "A Procedure for Robust Design: Minimizing Variations Caused by Noise Factors and Control Factors," ASME Journal of Mechanical Design, Vol. 118, pp. 478-485, 1996.

7. Du, X., and Chen, W., "Propagation and Management of Uncertainties in Simulation-Based Collaborative System Design," 27-RDU-1, Proceedings of the 3rd World Congress of Structural and Multidisciplinary Optimization, Buffalo, New York, May 1999.

8. Du, X., and Chen, W., "An Efficient Approach to Probabilistic Uncertainty Analysis in Simulation-Based Multidisciplinary Design," AIAA-2000-0423, Proceedings of the 41st AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics and Materials Conference, Atlanta, Georgia, April 2000.

9. Eggert, R.J., "Quantifying Design Feasibility Using Probabilistic Feasibility Analysis," 1991 ASME Advances in Design Automation, Paper No. DE-Vol. 32-1, pp. 235-240, 1991.

10. French, S., "Uncertainty and Imprecision: Modelling and Analysis," Journal of the Operational Research Society, Vol. 46, No. 1, Jan. 1995, pp. 70-79.

11. Gu, X., Renaud, J.E., and Batill, S.M., "An Investigation of Multidisciplinary Design Subject to Uncertainty," AIAA-98-4747, Proceedings of the 7th AIAA/NASA/USAF/ISSMO Symposium on Multidisciplinary Analysis and Optimization, St. Louis, Missouri, September 1998.

12. Gu, X., Renaud, J.E., and Batill, S.M., "An Investigation of Uncertainty in Multidisciplinary Robust Design Optimization," 27-RDU-3, Proceedings of the 3rd World Congress of Structural and Multidisciplinary Optimization, Buffalo, New York, May 1999.

13. Hazelrigg, G.A., "A Framework for Decision-Based Engineering Design," Journal of Mechanical Design, Vol. 120, No. 4, p. 653, 1998.

14. Hoybye, J.A., "Model Error Propagation and Data Collection Design, An Application in Water Quality Modeling," Water, Air and Soil Pollution, Vol. 103, No. 1-4, pp. 101-109, 1998.

15. Iman, R.L., and Conover, W.J., "A Distribution-Free Approach to Introducing Rank Correlation Among Input Variables," Communications in Statistics, Simulation and Computation, Vol. 11, No. 3, pp. 311-334, 1982.

16. Isukapalli, S.S., and Georgopoulos, P.G., "Stochastic Response Surface Methods (SRSMs) for Uncertainty Propagation: Application to Environmental and Biological Systems," Risk Analysis, Vol. 18, pp. 351-363, 1998.

17. Klir, G.J., "Probabilistic versus Possibilistic Conceptualization of Uncertainty," in Analysis and Management of Uncertainty: Theory and Applications, B.M. Ayyub, M.M. Gupta, and L.N. Kanal Eds., North-Holland, New York, 1992, pp. 13-26.

18. Koch, P.N., Simpson, T.W., Allen, J.K., and Mistree, F., "Statistical Approximations for Multidisciplinary Design Optimization: The Problem of Size," Journal of Aircraft, Vol. 36, No. 1, 1999, pp. 275-286.

19. Laskey, K.B., "Model Uncertainty: Theory and Practical Implications," IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, Vol. 26, No. 3, pp. 340-348, 1996.

20. Manners, W., "Classification and Analysis of Uncertainty in Structural Systems," Proceedings of the 3rd IFIP WG 7.5 Conference on Reliability and Optimization of Structural Systems, Berkeley, California, March 1990, pp. 251-260.

21. Mavris, D.N., Bandte, O., and DeLaurentis, D.A., "Robust Design Simulation: A Probabilistic Approach to Multidisciplinary Design," Journal of Aircraft, Vol. 36, No. 1, 1999, p. 298.

22. McKay, M.D., Beckman, R.J., and Conover, W.J., "A Comparison of Three Methods for Selecting Values of Input Variables in the Analysis of Output from a Computer Code," Technometrics, Vol. 21, No. 2, 1979, pp. 239-245.

23. Muhanna, R.L., and Mullen, R.L., "Development of Interval Based Methods for Fuzziness in Continuum Mechanics," IEEE Computer Society, 3rd International Symposium on Uncertainty Modeling and Analysis and Annual Conference of the North American Fuzzy Information Processing Society, College Park, MD, 1995.

24. Rubinstein, R.Y., Simulation and the Monte Carlo Method, John Wiley & Sons, New York, 1981.

25. Schlesinger, S., et al., "Terminology for Model Credibility," Simulation, Vol. 32, No. 3, pp. 103-104, 1979.

26. Scott, M.J., and Antonsson, E.K., "Preliminary Vehicle Structure Design: An Industrial Application of Imprecision in Engineering Design," DETC98/DTM-56, Proceedings of DETC'98, 1998 ASME Design Engineering Technical Conferences, Atlanta, Georgia, September 1998.

27. Sues, R.H., Oakley, D.R., and Rhodes, G.S., "Multidisciplinary Stochastic Optimization," Proceedings of the 10th Conference on Engineering Mechanics, Part 2, Vol. 2, Boulder, CO, May 1996.

28. Walker, J.R., "Practical Application of Variance Reduction Techniques in Probabilistic Assessments," 2nd International Conference on Radioactive Waste Management, Winnipeg, Manitoba, Canada, pp. 517-521, 1986.

29. Wood, K.L., and Antonsson, E.K., "A Fuzzy Approach to Computational Tools for Preliminary Engineering Design," Advances in Design Automation, ASME, pp. 263-271, 1987.

30. Wood, K.L., Antonsson, E.K., and Beck, J.L., "Representing Imprecision in Engineering Design: Comparing Fuzzy and Probability Calculus," Research in Engineering Design, Vol. 1, No. 3-4, pp. 187-203, December 1990.

31. Zimmermann, H.-J., "An Application-oriented View of Modelling Uncertainty," European Journal of Operational Research, Vol. 122, No. 2, pp. 190-198, 2000.

