68 results on '"Raymond R. Hill"'
Search Results
2. A scenario-based parametric analysis of the army personnel-to-assignment matching problem
- Author
-
Brian J. Lunday, Raymond R. Hill, and Matthew D. Ferguson
- Subjects
Mathematical optimization, Linear programming, Matching (graph theory), Computer science, Mechanical Engineering, Energy Engineering and Power Technology, Management Science and Operations Research, Stable marriage problem, Set (abstract data type), Robustness (computer science), Talent management, Assignment problem, Categorical variable
- Abstract
Purpose: This study aims to compare linear programming and stable marriage approaches to the personnel assignment problem under conditions of uncertainty. Robust solutions should exhibit reduced variability when one or more additional constraints or problem perturbations are added to a baseline problem. Design/methodology/approach: Several variations of each approach are compared with respect to solution speed, solution quality as measured by officer-to-assignment preferences, and solution robustness as measured by the number of assignment changes required after inducing a set of representative perturbations or constraints to an assignment instance. These side constraints represent the realistic categorical assignment priorities and limitations encountered by Army assignment managers, who solve this problem semiannually, so the synthetic instances considered herein emulate typical problem instances. Findings: The results provide insight regarding the trade-offs between traditional optimization and heuristic-based solution approaches. Originality/value: The results indicate the viability of using the stable marriage algorithm for talent management via the talent marketplace currently used by both the U.S. Army and U.S. Air Force for personnel assignments. (An illustrative stable-matching sketch follows this entry.)
- Published
- 2020
- Full Text
- View/download PDF
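As a companion to the entry above, here is a minimal sketch of the deferred-acceptance (Gale-Shapley) step that a stable-marriage approach to assignment builds on. The officer and billet identifiers, preference lists, and the absence of the study's side constraints are all assumptions for illustration, not details from the paper.

```python
# Minimal Gale-Shapley deferred-acceptance sketch (illustrative only; the paper's
# officer/assignment data and side constraints are not reproduced here).

def stable_match(officer_prefs, billet_prefs):
    """officer_prefs / billet_prefs: dict mapping each id to an ordered preference list."""
    free = list(officer_prefs)                      # officers not yet matched
    next_choice = {o: 0 for o in officer_prefs}     # index of next billet to propose to
    engaged = {}                                    # billet -> officer
    rank = {b: {o: i for i, o in enumerate(p)} for b, p in billet_prefs.items()}

    while free:
        o = free.pop(0)
        b = officer_prefs[o][next_choice[o]]
        next_choice[o] += 1
        if b not in engaged:                        # billet unfilled: tentatively accept
            engaged[b] = o
        elif rank[b][o] < rank[b][engaged[b]]:      # billet prefers the new officer
            free.append(engaged[b])
            engaged[b] = o
        else:                                       # billet rejects the proposal
            free.append(o)
    return {o: b for b, o in engaged.items()}

# Hypothetical toy instance (names are assumptions, not from the study)
officers = {"O1": ["B1", "B2"], "O2": ["B1", "B2"]}
billets = {"B1": ["O2", "O1"], "B2": ["O1", "O2"]}
print(stable_match(officers, billets))              # {'O2': 'B1', 'O1': 'B2'}
```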
3. Wavelet analysis of variance box plot
- Author
-
Jeffrey Williams, Eric Chicken, Raymond R. Hill, and Joseph J. Pignatiello
- Subjects
Statistics and Probability, Box plot, Functional data analysis, Pattern recognition, Variance (accounting), Articles, Visualization, Wavelet, Data visualization, Outlier, Key (cryptography), Artificial intelligence, Statistics, Probability and Uncertainty, Mathematics
- Abstract
Functional box plots satisfy two needs: visualization of functional data and the calculation of important box plot statistics. Data visualization illuminates key characteristics of functional sets missed by statistical tests and summary statistics. The calculation of box plot statistics for functional sets permits a novel comparison better suited to functional data. The functional box plot uses a depth method to visualize and rank smooth functional curves in terms of a mean, box, whiskers, and outliers. The functional box plot improves upon other classic functional data analysis tools, such as functional principal components and discriminant analysis, for outlier detection. This research adds wavelet analysis as a generating mechanism, along with depth, for functional box plots to visualize functional data and calculate relevant statistics. The wavelet analysis of variance (WANOVA) box plot tool gives competitive error rates in Gaussian test cases with magnitude outliers and outperforms the functional box plot in Gaussian test cases with shape outliers. Further, we show wavelet analysis is well suited to approximating irregular and noisy functional data and demonstrate the enhanced capability of WANOVA box plots to classify shape outliers that follow a different pattern than other functional data, for both simulated and real data instances. (An illustrative wavelet-preprocessing sketch follows this entry.)
- Published
- 2021
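A hedged sketch of the wavelet step that a WANOVA-style functional box plot builds on, using PyWavelets: decompose each curve, soft-threshold the detail coefficients, and reconstruct before any depth ranking. The synthetic curves, wavelet choice, and thresholding rule are assumptions; the paper's depth calculation and box-plot construction are not reproduced.

```python
# Illustrative wavelet denoising of functional data (assumed preprocessing step;
# the paper's depth ranking and box-plot construction are not reproduced).
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 256)
curves = np.sin(2 * np.pi * t) + 0.2 * rng.standard_normal((20, t.size))  # 20 noisy curves

def wavelet_denoise(y, wavelet="db4", level=4):
    coeffs = pywt.wavedec(y, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # noise estimate (finest detail)
    thresh = sigma * np.sqrt(2 * np.log(y.size))              # universal threshold
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: y.size]

smooth = np.array([wavelet_denoise(y) for y in curves])
central = np.median(smooth, axis=0)                           # crude central curve
band = np.percentile(smooth, [25, 75], axis=0)                # crude central region
print(central.shape, band.shape)
```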
4. Modern Data Analytics for the Military Operational Researcher
- Author
-
Raymond R. Hill
- Subjects
Information engineering, Statistical learning, Computer science, Realm, Big data, Key (cryptography), Data analysis, Unsupervised learning, Data science, Statistician
- Abstract
The military establishment, mimicking the business world in many respects, is abuzz with the rapid growth in big data, data analytics, machine learning, and statistical learning using artificial intelligence as well as supervised and unsupervised learning. The real challenge is understanding what all these concepts mean for the military and how they relate to the fundamental statistical principles historically employed by the defense operations research analyst or statistician. This chapter defines the key terms in the area and discusses the key aspects of the modern data challenge: big data, data engineering, data science, data analytics, and data mining. The chapter then focuses on data analytics, the range of methods and concerns associated with it, and some of the techniques within the data analytics realm. The chapter closes with recommendations and concerns for the future.
- Published
- 2020
- Full Text
- View/download PDF
5. An opinion dynamics model of meta-contrast with continuous social influence forces
- Author
-
J. O. Miller, Christopher W. Weimer, Douglas D. Hodson, and Raymond R. Hill
- Subjects
Statistics and Probability, Flexibility (engineering), Computational complexity theory, Basis (linear algebra), Management science, Process (engineering), Computer science, Field (Bourdieu), Contrast (statistics), Statistical and Nonlinear Physics, Modular design, Social influence
- Abstract
Opinion dynamics is the study of how opinions in a group of individuals change over time and opinion dynamics models attempt to mathematically formulate this process. This research lays the foundations for, and develops the meta-contrast influence field model, a novel opinion dynamics model based on self-categorization theory. It improves on the existing meta-contrast model by providing a properly scaled, continuous influence basis while replicating key results of the original model. This influence basis is modular in nature, allowing future research to include other competing psychological forces in the mathematical formulation of influence. This flexibility is achieved while drastically reducing computational complexity, making feasible larger models of more psychologically complex agents.
- Published
- 2022
- Full Text
- View/download PDF
6. Predicting success in United States Air Force pilot training using machine learning techniques
- Author
-
Phillip R. Jenkins, Raymond R. Hill, and William N. Caballero
- Subjects
Economics and Econometrics, Critical time, Computer science, Project commissioning, Strategy and Management, Geography, Planning and Development, Economic shortage, Management Science and Operations Research, Machine learning, Variety (cybernetics), Tree (data structure), Quality (business), Artificial intelligence, Statistics, Probability and Uncertainty, Pilot training
- Abstract
The chronic pilot shortage that has plagued the United States Air Force over the past three years poses a national-level problem that senior military members are working to overcome. Unfortunately, not all pilot candidates successfully complete the necessary training requirements to become fully qualified Air Force pilots, which wastes critical time and resources and only further exacerbates the pilot shortage problem. Therefore, it is important for the Air Force to carefully consider whom they select to attend pilot training. This research examines historical specialized undergraduate pilot training (SUPT) candidate data leveraging a variety of machine learning techniques to obtain insights on candidate success. Computational experimentation is performed to determine how selected machine learning techniques and their respective hyperparameters affect solution quality. Results reveal that the extremely randomized tree machine learning technique can achieve nearly 94% accuracy in predicting candidate success. Additional analysis indicates degree type and commissioning source are the most important features in determining candidate success. Ultimately, this research can inform the modification of future SUPT candidate selection criteria and other related Air Force personnel policies. (An illustrative tree-ensemble sketch follows this entry.)
- Published
- 2022
- Full Text
- View/download PDF
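In the spirit of the entry above, a brief scikit-learn sketch of an extremely randomized trees classifier with feature importances. The synthetic features (hypothetical degree-type and commissioning-source codes), the data-generating rule, and the model settings are assumptions, not SUPT data.

```python
# Hedged sketch of an extremely-randomized-trees classifier (synthetic stand-in
# for SUPT candidate data; feature names and labels are assumptions).
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1000
X = np.column_stack([
    rng.integers(0, 4, n),            # hypothetical degree-type code
    rng.integers(0, 3, n),            # hypothetical commissioning-source code
    rng.normal(3.2, 0.4, n),          # hypothetical aptitude-style score
])
y = (0.5 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(size=n) > 1.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = ExtraTreesClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print("hold-out accuracy:", round(clf.score(X_te, y_te), 3))
print("feature importances:", np.round(clf.feature_importances_, 3))
```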
7. A case study in engineering model validation using new wavelet‐based methods
- Author
-
Cara C. Rupp, Raymond R. Hill, Andrew D. Atkinson, and Kaitlyn M. Jones
- Subjects
Wavelet, Computer science, Pattern recognition, Artificial intelligence, Management Science and Operations Research, Safety, Risk, Reliability and Quality, Model validation
- Published
- 2018
- Full Text
- View/download PDF
8. Wavelet ANOVA bisection method for identifying simulation model bias
- Author
-
Raymond R. Hill, G. Geoffrey Vining, Eric Chicken, Andrew D. Atkinson, Joseph J. Pignatiello, and Edward D. White
- Subjects
Correctness, Computer science, Interval (mathematics), Machine learning, Set (abstract data type), Wavelet, Hardware and Architecture, Modeling and Simulation, Bisection method, Verification and validation of computer simulation models, Artificial intelligence, Representation (mathematics), Software, Verification and validation
- Abstract
High-resolution computer models can simulate complex systems and processes in order to evaluate a solution quickly and inexpensively. Many simulation models produce dynamic functional output, such as a set of time-series data generated during a process. These computer models require verification and validation (V&V) to assess the correctness of these simulations. In particular, the model validation effort evaluates if the model is an appropriate representation of the real-world system that it is meant to simulate. However, when assessing a model capable of generating functional output, it is useful to learn more than simply whether the model is valid or invalid. Specifically, if the model is deemed invalid, then what aspects of the model are incorrect? Is it possible to identify over what range the model data are a poor representation of the system data? Current V&V methods cannot identify these ranges. This paper proposes a wavelet analysis of variance (WANOVA) bisection method that first assesses model validity and can also identify the interval(s) over which the model is biased. The technique is illustrated using several simulation studies. Ultimately, this new method supports and expands the efficacy of model validation efforts. (A simplified bisection-localization sketch follows this entry.)
- Published
- 2018
- Full Text
- View/download PDF
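A simplified sketch of the bisection idea described above: test an interval for disagreement between system and model signals and, if it fails, split and recurse to localize the bias. A plain mean-difference check stands in for the paper's wavelet ANOVA test, so this is an assumed illustration rather than the published method.

```python
# Illustrative bisection localization of model bias (a simple mean-difference
# check stands in for the paper's wavelet ANOVA test).
import numpy as np

def biased_intervals(system, model, lo, hi, tol=0.1, min_len=8):
    """Return index intervals [lo, hi) where the mean absolute difference exceeds tol."""
    if np.mean(np.abs(system[lo:hi] - model[lo:hi])) <= tol:
        return []                                  # interval looks unbiased
    if hi - lo <= min_len:
        return [(lo, hi)]                          # small enough: report as biased
    mid = (lo + hi) // 2
    return (biased_intervals(system, model, lo, mid, tol, min_len)
            + biased_intervals(system, model, mid, hi, tol, min_len))

t = np.linspace(0, 1, 256)
system = np.sin(2 * np.pi * t)
model = system.copy()
model[96:128] += 1.0                               # inject a localized bias
print(biased_intervals(system, model, 0, t.size))  # reports intervals covering 96-128
```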
9. Using dynamic Bayesian networks as simulation metamodels based on bootstrapping
- Author
-
J. O. Miller, Kenneth W. Bauer, Raymond R. Hill, and Clayton T. Kelleher
- Subjects
General Computer Science, Computer science, Bootstrapping, Computation, General Engineering, Probabilistic logic, Bayesian network, Machine learning, Field (computer science), Modeling and simulation, Artificial intelligence, Dynamic Bayesian network
- Abstract
Modeling and Simulation (M&S) is an ever-growing field that relies on increasingly complex and expensive simulations; these often require substantial time to run and can be difficult to maintain and analyze, yet their use continues to provide useful insight for leadership decisions. In a time-critical decision situation, running such models may not be a practical way to provide leadership the insights they need when they need them. Bayesian Networks (BN) are a useful analytical tool that offers understandable probabilistic relationships between variables, efficient computation, and quick analytical results. A BN indexed by time is a Dynamic Bayesian Network (DBN). A drawback to their use as a meta-model is the number of runs needed to train either the BN or DBN. This research introduces and demonstrates the use of bootstrapping techniques to reduce the number of actual simulation runs necessary to sufficiently train DBNs used as simulation meta-models. DBNs trained with independent simulation runs are compared statistically to DBNs trained with bootstrap-based samples developed from various-sized subsets of actual simulation runs, and empirical examples are used to demonstrate the methodology. A method extending the DBN meta-model to multiple dimensions is presented and demonstrated. A case study using an actual combat simulation model completes the presentation. (A minimal bootstrap-resampling sketch follows this entry.)
- Published
- 2018
- Full Text
- View/download PDF
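A minimal sketch of the bootstrap step the entry above relies on: resampling whole simulation runs with replacement to enlarge the training set before fitting a DBN. The array shapes and placeholder data are assumptions; the DBN learning itself is omitted.

```python
# Hedged sketch: bootstrap additional "runs" from a small set of actual simulation
# runs before training a DBN metamodel (shapes and data are placeholders).
import numpy as np

rng = np.random.default_rng(42)
n_actual, n_steps, n_vars = 30, 50, 4                       # assumed, not from the paper
actual_runs = rng.normal(size=(n_actual, n_steps, n_vars))  # stand-in simulation output

def bootstrap_runs(runs, n_samples):
    """Resample whole runs with replacement to build a larger training set."""
    idx = rng.integers(0, len(runs), size=n_samples)
    return runs[idx]

training_set = bootstrap_runs(actual_runs, n_samples=500)
print(training_set.shape)                                   # (500, 50, 4) -> DBN training data
```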
10. Quantifying Parameter Variability on a Population of Aerospace Synchronous Generators
- Author
-
William Perdikakis, Chase Kitzmiller, Raymond R. Hill, Brett A. Robbins, Kevin J. Yost, and Kaitlyn M. Jones
- Subjects
Computer science, Population, Parameterized complexity, Control engineering, Atmospheric model, Maintenance engineering, Variety (cybernetics), Electric power system, Component (UML), Aerospace
- Abstract
Electric machines are a critical component of the power system architecture of modern and future aircraft. Consequently, it is necessary that the representative machine models accurately capture the dynamic response of these machines for a variety of operating conditions and system configurations. In this paper, a population of fielded electric machines at various stages in their life cycles is tested. The observed responses of the machines are parameterized and compared using statistical methods.
- Published
- 2019
- Full Text
- View/download PDF
11. Applying statistical engineering to the development of a ballistic impact flash model
- Author
-
Jaime J. Bestard, Dana F. Morrill, Raymond R. Hill, Thomas P. Talafuse, and Darryl K. Ahner
- Subjects
Engineering, Fragment (computer graphics), Survivability, Industrial and Manufacturing Engineering, Time series modeling, Flash (photography), Missile, Aerospace engineering, Engineering statistics, Safety, Risk, Reliability and Quality, Ballistic impact
- Abstract
Military aircraft must often fly in hostile environments. Missile explosions generate fragments that may impact the aircraft. These fragment impacts can penetrate the aircraft and may cause flash fires external and internal to the aircraft which may cau..
- Published
- 2016
- Full Text
- View/download PDF
12. Back in business: operations research in support of big data analytics for operations and supply chain management
- Author
-
Raymond R. Hill, Benjamin T. Hazen, Christopher A. Boone, and Joseph B. Skipper
- Subjects
Knowledge management, Supply chain management, Computer science, Big data, General Decision Sciences, Context (language use), Management Science and Operations Research, Business operations, Data science, Field (computer science), Work (electrical), Analytics, Behavioral operations research
- Abstract
Few topics have generated more discourse in recent years than big data analytics. Given their knowledge of analytical and mathematical methods, operations research (OR) scholars would seem well poised to take a lead role in this discussion. Unfortunately, some have suggested there is a misalignment between the work of OR scholars and the needs of practicing managers, especially those in the field of operations and supply chain management, where data-driven decision-making is a key component of most job descriptions. In this paper, we attempt to address this misalignment. We examine both applied and scholarly applications of OR-based big data analytical tools and techniques within an operations and supply chain management context to highlight their future potential in this domain. This paper contributes by providing suggestions for scholars, educators, and practitioners that help illustrate how OR can be instrumental in solving big data analytics problems in support of operations and supply chain management.
- Published
- 2016
- Full Text
- View/download PDF
13. Using design of experiments methods for applied computational fluid dynamics: A case study
- Author
-
Christopher L. Martin, Alex J. Gutman, Mark F. Reeder, Raymond R. Hill, and Timothy A. Cleaver
- Subjects
Engineering, Angle of attack, Design of experiments, Aerodynamics, Computational fluid dynamics, Industrial and Manufacturing Engineering, Missile, Surrogate model, Latin hypercube sampling, Safety, Risk, Reliability and Quality, Gaussian process, Simulation
- Abstract
This article presents an application of design of experiments (DOE) in a computational fluid dynamics (CFD) environment to study forces and moments acting on a missile through various speeds and angles of attack. Researchers employed a four-factor Latin hypercube space-filling design and the Gaussian Process to build a surrogate model of the CFD environment (an illustrative design-and-surrogate sketch follows this entry). The surrogate model was used to characterize missile aerodynamic coefficients across the transonic flight regime. The DOE process completed the task with fewer computational resources than a traditional one-factor-at-a-time (OFAT) approach. To validate the surrogate model, specific OFAT angle of attack sweeps were performed. This provided a direct comparison between the Gaussian Process model and OFAT analysis. In most cases, the surrogate computer model was able to accurately capture the nonlinear response variables. Moreover, the surrogate model enabled a dynamic prediction tool that could investigate untested scenarios, a capability not avai...
- Published
- 2016
- Full Text
- View/download PDF
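A hedged sketch of the space-filling-design-plus-surrogate workflow described above, using scipy's Latin hypercube sampler and scikit-learn's Gaussian process regressor. The four factors, their scaling, and the toy response standing in for a CFD run are assumptions.

```python
# Illustrative space-filling design + Gaussian process surrogate (assumed toy response).
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Four hypothetical factors scaled to [0, 1] (e.g., Mach, angle of attack, ...)
sampler = qmc.LatinHypercube(d=4, seed=0)
X = sampler.random(n=40)

def toy_response(x):                           # stand-in for an expensive CFD run
    return np.sin(3 * x[:, 0]) + x[:, 1] ** 2 + 0.5 * x[:, 2] * x[:, 3]

y = toy_response(X)
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True).fit(X, y)

# "OFAT sweep" check: vary factor 0 while holding the others fixed
sweep = np.column_stack([np.linspace(0, 1, 11), np.full((11, 3), 0.5)])
pred, _ = gp.predict(sweep, return_std=True)
print(np.round(pred, 3))
```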
14. Development of an agent-based model for the secondary threat resulting from a ballistic impact event
- Author
-
Frank W. Ciarallo, Raymond R. Hill, and M. J. Bova
- Subjects
Agent-based model, Event (computing), Computer science, Simulation modeling, Survivability, Debris, Missile, Modeling and Simulation, Aerospace engineering, Discrete event simulation, Software, Simulation, Ballistic impact
- Abstract
Military aircraft must often operate in hostile environments. A worrisome threat to aircraft is the high-velocity fragments emanating from missile detonations near the aircraft. These fragments may impact and penetrate the aircraft, causing fires in the aircraft. The process by which a high-velocity impact event leads to fire ignition onboard military vehicles is complex, influenced by the interaction of heated debris fragments and fuel spurting from ruptured tanks. An assessment of the risk of such a fire begins with a complete characterization of the secondary threat resulting from the impact, including debris fragment sizes, states of motion, and thermal properties. In the aircraft survivability community, there is a need for an analytical tool to model this complete threat. This paper approaches the problem by describing an agent-based simulation model of the fragments in a debris cloud. An analytical/empirical impact fragmentation model is developed for incorporation into the simulation model, which determines fragment sizes and states of motion. Development and study of this proof-of-concept effort leads to a deeper understanding of such secondary threats and demonstrates the value of agent-based simulation models as an analytical tool. Empirical assessment of model results indicates the viability of the approach.
- Published
- 2016
- Full Text
- View/download PDF
15. Editorial: Welcome to the Journal of Defense Analytics and Logistics
- Author
-
Raymond R. Hill and Ben Hazen
- Subjects
Engineering, Analytics, Data science
- Published
- 2017
- Full Text
- View/download PDF
16. Case Studies: Definitive Screening Applied to a Simulation Study of the F100-229 Engine Repair Network
- Author
-
Tom D. Stafford, Kelly R. Bush, Alex J. Gutman, Roger D. Moulder, and Raymond R. Hill
- Subjects
Eagle, Government, Engineering, Screening method, Safety, Risk, Reliability and Quality, Inventory cost, Industrial and Manufacturing Engineering, Manufacturing engineering
- Abstract
Problem: An adequate supply of F100-229 engines, which power the F-15 Eagle and late-model F-16 fighter jets, is vital to meet the operational demands of the Air Force. This supply is necessarily limited by government and inventory cost considerations. ..
- Published
- 2015
- Full Text
- View/download PDF
17. Analysis of an Intervention for Small Unmanned Aerial System (SUAS) Accidents: A Case Study Involving Simpson's Paradox
- Author
-
Joseph J. Pignatiello, Raymond R. Hill, and Sean E. Wolf
- Subjects
Engineering, Intervention (law), Operations research, Aeronautics, Process (engineering), Accident analysis, Certification, Safety, Risk, Reliability and Quality, Industrial and Manufacturing Engineering, Simpson's paradox
- Abstract
This case study examines a process intervention designed to lower small unmanned aerial system (SUAS) mishap rates. The intervention implemented review boards, required various certifications, and produced apparent decreases in flight failures and fligh..
- Published
- 2015
- Full Text
- View/download PDF
18. Testing local search move operators on the vehicle routing problem with split deliveries and time windows
- Author
-
Marcus E. McNabb, Jeffery D. Weir, Raymond R. Hill, and Shane N. Hall
- Subjects
Mathematical optimization, Ideal (set theory), General Computer Science, Heuristic (computer science), Heuristic, Computer science, Ant colony optimization algorithms, Management Science and Operations Research, Time windows, Modeling and Simulation, Vehicle routing problem, Quality (business), Local search (optimization)
- Abstract
The vehicle routing problem (VRP) is an important transportation problem. The literature addresses several extensions of this problem, including variants having delivery time windows associated with customers and variants allowing split deliveries to customers. The problem extension including both of these variations has received less attention in the literature. This research effort sheds further light on this problem. Specifically, this paper analyzes the effects of combinations of local search (LS) move operators commonly used on the VRP and its variants. We find that, when paired with a MAX-MIN Ant System constructive heuristic, Or-opt and 2-opt* appear to be the ideal LS operators to employ on the VRP with split deliveries and time windows, with Or-opt finding higher-quality solutions and 2-opt* requiring less run time. (A simplified 2-opt local-search sketch follows this entry.)
- Published
- 2015
- Full Text
- View/download PDF
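A simplified local-search sketch in the spirit of the entry above: a within-route 2-opt improvement pass on a single tour. The paper's operators (Or-opt and the inter-route 2-opt*) and the MAX-MIN Ant System construction are more involved; the coordinates and starting route here are hypothetical.

```python
# Minimal within-route 2-opt improvement (illustrative; not the paper's 2-opt* or Or-opt).
import math

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour, pts):
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for j in range(i + 1, len(tour)):
                candidate = tour[:i] + tour[i:j][::-1] + tour[j:]   # reverse one segment
                if tour_length(candidate, pts) < tour_length(tour, pts) - 1e-9:
                    tour, improved = candidate, True
    return tour

points = [(0, 0), (0, 3), (4, 3), (4, 0), (2, 1)]   # hypothetical customer locations
route = [0, 2, 1, 4, 3]                             # deliberately poor starting route
best = two_opt(route, points)
print(best, round(tour_length(best, points), 3))
```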
19. Dynamic Model Validation Metric Based on Wavelet Thresholded Signals
- Author
-
Joseph J. Pignatiello, Andrew D. Atkinson, Eric Chicken, G. Geoffrey Vining, Raymond R. Hill, and Edward D. White
- Subjects
Statistics and Probability, Speech recognition, Pattern recognition, Computer Science Applications, Model validation, Wavelet, Computational Theory and Mathematics, Modeling and Simulation, Metric (mathematics), Artificial intelligence, Mathematics
- Abstract
Model validation is a vital step in the simulation development process to ensure that a model is truly representative of the system that it is meant to model. One aspect of model validation that deserves special attention is when validation is required for the transient phase of a process. The transient phase may be characterized as the dynamic portion of a signal that exhibits nonstationary behavior. A specific concern associated with validating a model's transient phase is that the experimental system data are often contaminated with noise, due to the short duration and sharp variations in the data, thus hiding the underlying signal which models seek to replicate. This paper proposes a validation process that uses wavelet thresholding as an effective method for denoising the system and model data signals to properly validate the transient phase of a model. This paper utilizes wavelet thresholded signals to calculate a validation metric that incorporates shape, phase, and magnitude error. The paper compares this validation approach to an approach that uses wavelet decompositions to denoise the data signals. Finally, a simulation study and empirical data from an automobile crash study illustrate the advantages of our wavelet thresholding validation approach. (A generic magnitude-and-phase error-metric sketch follows this entry.)
- Published
- 2017
- Full Text
- View/download PDF
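A hedged sketch of a combined magnitude-and-phase comparison between denoised model and system signals, in the spirit of the metric described above. The formulas follow the well-known Sprague and Geers form, which may differ from the paper's exact metric, and the signals are synthetic.

```python
# Sprague & Geers-style magnitude/phase comparison of two transient signals
# (generic illustration; not necessarily the exact metric used in the paper).
import numpy as np

t = np.linspace(0.0, 1.0, 500)
system = np.exp(-5 * t) * np.sin(20 * t)                         # synthetic "test" signal
model = 1.1 * np.exp(-5 * (t - 0.02)) * np.sin(20 * (t - 0.02))  # scaled, shifted model

def sprague_geers(measured, computed):
    imm = float(np.dot(measured, measured))                      # discrete approximations
    icc = float(np.dot(computed, computed))                      # of the metric's integrals
    imc = float(np.dot(measured, computed))
    magnitude = np.sqrt(icc / imm) - 1.0
    phase = np.arccos(np.clip(imc / np.sqrt(imm * icc), -1.0, 1.0)) / np.pi
    return magnitude, phase, float(np.hypot(magnitude, phase))

M, P, C = sprague_geers(system, model)
print(f"magnitude error {M:.3f}, phase error {P:.3f}, combined {C:.3f}")
```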
20. A History of Military Computer Simulation
- Author
-
Raymond R. Hill and Andreas Tolk
- Subjects
Military personnel, Engineering, Aeronautics, Military computers, Military simulation, Pillar, Training (meteorology), Plan (drawing), Track (rail transport), Winter simulation
- Abstract
The history of military simulation dates back to the earliest ages of conflict. Throughout history, military personnel have employed cognitive devices to comprehend and plan for conflict. This chapter provides a history of military simulation, starting with a review of board-based war games, progressing through the computerization of those war games, and concluding with a discussion of the modern world of military simulation encompassing desktop-based analytical simulations, video games, and distributed training and analytical environments. It provides a background for the contributions of the military track that has been a pillar of the Winter Simulation Conference for many years.
- Published
- 2017
- Full Text
- View/download PDF
21. Using Neural Networks and Logistic Regression to Model Small Unmanned Aerial System Accident Results for Risk Mitigation
- Author
-
Sean E. Wolf, Joseph J. Pignatiello, and Raymond R. Hill
- Subjects
Engineering, Artificial neural network, System accident, Data mining, Logistic regression, Risk management
- Published
- 2014
- Full Text
- View/download PDF
22. Nonlinear screening designs for defense testing: an overview and case study
- Author
-
James R. Simpson, Shane Dougherty, Joseph J. Pignatiello, Edward D. White, and Raymond R. Hill
- Subjects
Nonlinear system, Engineering, Subject-matter expert, Operations research, Modeling and Simulation, Design of experiments, Screening design, Systems engineering, Operational acceptance testing, Engineering (miscellaneous), Wind tunnel
- Abstract
Recent efforts to improve the statistical rigor of both developmental and operational testing include recommendations to consider second-order screening designs. In this paper, we provide a broad review of screening designs and second-order screening designs. We then consider a real-world wind tunnel experiment case study from the recently published literature and use subject matter expertise to assess the viability of a recently developed second-order screening design, the DSD+, to sufficiently screen important factors in complex, nonlinear systems of interest using just a single experimental design.
- Published
- 2014
- Full Text
- View/download PDF
23. Bridging the Gap between Quantitative and Qualitative Accelerated Life Tests
- Author
-
Raymond R. Hill, Richard L. Warr, Jason K. Freels, and Joseph J. Pignatiello
- Subjects
Estimation, Engineering, Bridging (networking), Failure data, Management Science and Operations Research, Accelerated life testing, Test (assessment), Reliability engineering, Quality (business), Safety, Risk, Reliability and Quality, Analysis method, Reliability (statistics)
- Abstract
Test planners have long sought the ability to incorporate the results of highly accelerated life testing (HALT) into an early estimate of system reliability. While case studies attest to the effectiveness of HALT in producing reliable products, the capability to translate the test's limited failure data into a meaningful measure of reliability improvement remains elusive. Further, a review of quality and reliability literature indicates that confusion exists over what defines a HALT and how HALT differs from quantitative accelerated life testing methods. Despite many authors making a clear distinction between qualitative and quantitative accelerated life tests, an explanation as to why this delineation exists cannot be found. In this paper, we consider an exemplary HALT composed of a single stressor to show that the HALT philosophy precludes the estimation of a system's hazard rate function parameters because of the test's fix implementation strategy. Four common accelerated failure data analysis methods are highlighted to show their limitations with respect to estimating reliability from HALT data. Finally, a modified accelerated reliability growth test is proposed as a way forward for future research in HALT scenarios to characterize the risk of attaining a reliability requirement and improve parameter estimation. Copyright © 2014 John Wiley & Sons, Ltd.
- Published
- 2014
- Full Text
- View/download PDF
24. A structural taxonomy for metaheuristic optimisation search methods
- Author
-
Edward A. Pohl and Raymond R. Hill
- Subjects
Adaptive memory, Polymers and Plastics, Computer science, Structural component, Machine learning, Industrial and Manufacturing Engineering, Empirical research, Search algorithm, Artificial intelligence, Business and International Management, Metaheuristic
- Abstract
Metaheuristic search algorithms have become ubiquitous in the applied optimisation world. Various works have appeared classifying and improving these algorithms and the particular processes embedded within the algorithms. Successful metaheuristic approaches have a common general structure to their search processes. To this end, we offer a structural taxonomy of metaheuristic search methods. This taxonomy serves as a framework for constructing and evaluating metaheuristic approaches from a general structural perspective as well as for conducting empirical research regarding the effectiveness of more detailed structural components. Implementation mechanisms of the detailed components within each structural component are left for future taxonomy research and development.
- Published
- 2019
- Full Text
- View/download PDF
25. Effect of Heredity and Sparsity on Second-Order Screening Design Performance
- Author
-
Raymond R. Hill, Shane Dougherty, James R. Simpson, Edward D. White, and Joseph J. Pignatiello
- Subjects
Engineering, Management Science and Operations Research, Random noise, Screening design, Heredity, Statistics, Screening method, Order (group theory), Safety, Risk, Reliability and Quality, Focus (optics), Algorithm, Type I and type II errors
- Abstract
Two recent and powerful screening designs are compared. The comparisons focus on how robust each design is with respect to assumptions of model heredity and sparsity. Four truth models having varied aspects of heredity and sparsity are used; each examined at four levels of random noise. Each screening method is analyzed against the random data using the method's proposed analysis approach and compared in terms of correctly identified model components, incorrectly identified model components (type I error), and missed model components (type II error). Copyright © 2013 John Wiley & Sons, Ltd.
- Published
- 2013
- Full Text
- View/download PDF
26. The art and science of live, virtual, and constructive simulation for test and analysis
- Author
-
Raymond R. Hill and Douglas D. Hodson
- Subjects
Computer science, Modeling and Simulation, Systems engineering, Live, virtual, and constructive, Software engineering, Engineering (miscellaneous), Constructive, Test (assessment)
- Abstract
Live, virtual, and constructive (LVC) simulation technologies are well-established in the areas of technology demonstration, mission rehearsal, and exercises. A promising new role for LVC simulation technology is to facilitate weapon systems testing by producing defendable results. Requirements to test new defense systems within a system-of-systems context and within joint force scenarios have placed demands on physical test ranges that are not likely to be met. The LVC testing option promises the breadth and depth of defense systems, either as real or simulated assets, to potentially meet the new test demands. However, leveraging this technology to support testing will require a shift in the approaches used by the LVC community. In this paper, we discuss the challenges facing the LVC testing initiative, both from an experimental and from an architectural and implementation standpoint.
- Published
- 2013
- Full Text
- View/download PDF
27. Quantifying radar measurement errors in a Live–Virtual–Constructive environment to determine system viability: a case study
- Author
-
Douglas D. Hodson, Bruce L. Esken, Alex J. Gutman, and Raymond R. Hill
- Subjects
Engineering, Correctness, Real-time computing, Fidelity, Jamming, Replicate, Constructive, Geographic coordinate conversion, Modeling and Simulation, Radar, Engineering (miscellaneous), Simulation, Abstraction (linguistics)
- Abstract
We present a case study that attempts to replicate the realism of a test range using a Live, Virtual and Constructive (LVC) simulation. Because resources are limited on a real test range, the Air Force Simulation and Analysis Facility at Wright-Patterson Air Force Base, Ohio, was tasked to build a simulation that emulated, to a high degree of fidelity, test range assets – the goal was to understand the impact of jamming on an enemy’s integrated air defense system. Before testing began, the simulated effects generated by the LVC simulation have to be viable, robust and realistic in order to provide credible data during later phases of experimentation. We present our approach to evaluating the correctness of the LVC simulation that considers sources of error associated with real-world radar measurements, errors caused by model abstraction and errors introduced due to real-time distributed simulation architectures, including shared state-space inconsistencies, coordinate conversion issues and a common time reference.
- Published
- 2013
- Full Text
- View/download PDF
28. The In-Transit Vigilant Covering Tour Problem for Routing Unmanned Ground Vehicles
- Author
-
Huang Teng Tan and Raymond R. Hill
- Subjects
Engineering, Mathematical optimization, Unmanned ground vehicle, Vertex cover, Travelling salesman problem, Benchmark (computing), Key (cryptography), Robot, Enhanced Data Rates for GSM Evolution, Routing (electronic design automation), Simulation
- Abstract
The routing of unmanned ground vehicles for the surveillance and protection of key installations is modeled as a new variant of the Covering Tour Problem (CTP). The CTP structure provides both the routing and target sensing components of the installation protection problem. Our variant is called the in-transit Vigilant Covering Tour Problem (VCTP) and considers not only the vertex cover but also the additional edge coverage capability of the unmanned ground vehicle while sensing in-transit between vertices. The VCTP is formulated as a Traveling Salesman Problem (TSP) with a dual set covering structure involving vertices and edges. An empirical study compares the performance of the VCTP against the CTP on test problems modified from standard benchmark TSP problems to apply to the VCTP. The VCTP performed generally better with shorter tour lengths but at higher computational cost.
- Published
- 2016
- Full Text
- View/download PDF
29. Experimental Design for Unmanned Aerial Systems Analysis
- Author
-
Brian B. Stone and Raymond R. Hill
- Subjects
Systems analysis, Aeronautics, Computer science, XM1216 Small Unmanned Ground Vehicle, Aerospace engineering
- Published
- 2016
- Full Text
- View/download PDF
30. Using simulation to analyze the maintenance architecture for a USAF weapon system
- Author
-
Raymond R. Hill, Daniel D. Mattioda, and Ricardo Garza
- Subjects
Design framework, Potential impact, Engineering, Operations research, Computer Graphics and Computer-Aided Design, Weapon system, Resource (project management), Modeling and Simulation, Aircraft maintenance, Architecture, Discrete event simulation, Software
- Abstract
The United States Air Force (USAF) is investigating the use of three levels of repair with its aircraft maintenance managerial structure. This study provides an initial look at the effect of maintenance resource collaboration among maintenance locations and the use of a centralized repair facility focusing on a critical line replacement unit for a major USAF weapon system. Maintenance data for prior year maintenance experiences are collected, fit into appropriate probability distributions and implemented in a discrete event simulation model. This model is then used within an experimental design framework to examine the potential impact of organizational changes to the USAF hierarchical maintenance structure.
- Published
- 2012
- Full Text
- View/download PDF
31. An agent-based modeling approach to analyze the impact of warehouse congestion on cost and performance
- Author
-
Brian L. Heath, Frank W. Ciarallo, and Raymond R. Hill
- Subjects
Distribution center, Engineering, Mechanical Engineering, Industrial engineering, Industrial and Manufacturing Engineering, Computer Science Applications, Warehouse, Simulation software, Control and Systems Engineering, Order (exchange), Component (UML), Key (cryptography), Systems engineering, Conceptual model, Software
- Abstract
This article discusses a novel agent-based modeling (ABM) approach to analyze the impact of warehouse congestion and presents results indicating the significant effect of congestion on cost and performance in various scenarios. In particular, the simulation represents the behaviors of the order pickers in a picker-to-part, low picking warehouse and focuses on representing the traffic and movements of the pickers. The key motivation for simulating this system is the lack of literature discussing models or simulations capable of representing the congestion component of order pickers, a component important in actual warehouse operations. The conceptual model of the simulation is described and justified using the Conceptual Model for Simulation Diagram™, and the simulation is constructed using the simulation software AnyLogic®. The simulation is operationally validated via a series of experiments performed to test the simulation's results against the expected dynamics of the system as described by Tompkins et al. (2003). After operationally validating the simulation, key results are discussed and it is shown that the ABM simulation paradigm is capable of quantitatively capturing new and traditionally difficult-to-explore dynamics in warehouse operations, including components of congestion not considered in the literature.
- Published
- 2012
- Full Text
- View/download PDF
32. Acquisition and Testing, DT/OT Testing: The Need for Two-Parameter Requirements
- Author
-
Stephen P. Chambal, Jerry W. Kitchen, Alex J. Gutman, and Raymond R. Hill
- Subjects
Engineering, Two parameter, Operations research, Analogy, System requirements specification, Management Science and Operations Research, Safety, Risk, Reliability and Quality, Design methods, Statistical hypothesis testing, Test (assessment)
- Abstract
An important initiative within the Department of Defense is the increased emphasis on using experimental design methods for test and evaluation. In this work, we discuss the underlying assumptions associated with hypothesis testing using an analogy to the legal system, tie this analogy to test and make the case for the use of two parameters in system specification, an objective and a threshold level. Copyright © 2012 John Wiley & Sons, Ltd.
- Published
- 2012
- Full Text
- View/download PDF
33. Application of agent based modelling to aircraft maintenance manning and sortie generation
- Author
-
Stephen P. Chambal, J. O. Miller, Raymond R. Hill, and Adam MacKenzie
- Subjects
Generation process, Prioritization, Engineering, Operations research, Task (project management), Unit (housing), Hardware and Architecture, Modeling and Simulation, Aircraft maintenance, Resource assignment, Software
- Abstract
This research develops an agent based simulation model for application to the sortie generation process, focusing on a single fighter aircraft unit. The simulation includes representations of each individual maintainer within the unit, along with supervisory agents that provide direction in the form of dynamic task prioritization and resource assignment. Using a high-fidelity depiction of each entity, an exploration of the effects of different mixes of skill levels and United States Air Force Specialty Codes (AFSCs) on sortie production is performed. Analysis is conducted using an experimental design with results presented demonstrating the effects of maintenance manning decisions on the Combat Mission Readiness (CMR) of a fighter unit.
- Published
- 2012
- Full Text
- View/download PDF
34. Examining improved experimental designs for wind tunnel testing using Monte Carlo sampling methods
- Author
-
Shay R. Capehart, August G. Roesener, Raymond R. Hill, and Derek A. Leggio
- Subjects
Engineering, Design of experiments, Monte Carlo method, Experimental data, Management Science and Operations Research, Safety, Risk, Reliability and Quality, Wind tunnel test, Simulation, Marine engineering, Wind tunnel
- Abstract
Wind tunnels are used in the design and testing of a wide variety of systems and products. Wind tunnel test campaigns involve a large number of experimental data points, can take a long time to accomplish, and can consume tremendous resources. Design of Experiments is a systematic, statistically based approach to experimental design and analysis that has the potential to improve the efficiency and effectiveness of wind tunnel testing. In Defense Acquisition, wind tunnel testing of aircraft systems may require years of effort to fully characterize the system of interest. We employ data from a fairly large legacy wind tunnel test campaign and compare that data's corresponding response surface to the response surfaces derived from data generated using smaller, statistically motivated experimental design strategies. The comparison is accomplished using a Monte Carlo sampling methodology coupled with a statistical comparison of the system's estimated response surfaces. Initial results suggest a tremendous opportunity to reduce wind tunnel test efforts without losing test information. Published in 2010 by John Wiley & Sons, Ltd.
- Published
- 2010
- Full Text
- View/download PDF
35. Assessing Transport Aircraft Inspection Strategies
- Author
-
Theodore K. Heiman, Alan W. Johnson, Raymond R. Hill, and Martha C. Cooper
- Subjects
Engineering, Downtime, Schedule, Information Systems and Management, Computer Networks and Communications, Workload, Computer Science Applications, Management Information Systems, Reliability engineering, Transport engineering, Mission effectiveness, Computational Theory and Mathematics, Periodic maintenance, Information Systems
- Abstract
Complex aircraft require periodic maintenance checks to assess needed repairs for continued vehicle availability. However, such checks are expensive and the associated aircraft downtime can reduce fleet mission effectiveness. The United States Air Force plans to consolidate the time-based (isochronal) C-5 aircraft major inspection activities for eight C-5 home stations into three locations. Isochronal inspections rely on a calendar method to schedule inspections and disregard actual flying hours between inspections. By having the same personnel perform these inspections for all flying units and by adopting commercial aircraft condition-based inspection strategies, the Air Force hopes to gain efficiencies in performing these inspections. Conversely, the site phase-out schedule and reduced number of inspection locations raises questions about whether overall C-5 mission capability will be reduced. These proposed revisions were simulated in a designed experiment to assess the impacts to fleet availability and inspection site workload.
- Published
- 2010
- Full Text
- View/download PDF
36. Some insights into the emergence of agent-based modelling
- Author
-
Raymond R. Hill and Brian L. Heath
- Subjects
Management science, Computer science, Context (language use), Cellular automaton, Modeling and Simulation, Cybernetics, Systems thinking, Artificial intelligence, Discrete event simulation, Complex adaptive system, Software
- Abstract
Agent-based modeling (ABM) has become a popular simulation analysis tool and has been used to examine systems from myriad domains. This article re-examines some of the scientific developments in computers, complexity, and systems thinking that helped lead to the emergence of ABM by shedding new light onto some old theories and connecting them to several key ABM principles of today. As it is often the case, examining history can lead to insightful views about the past, present, and the future. Thus, themes from cellular automata and complexity, cybernetics and chaos, and complex adaptive systems are examined and placed in historical context to better establish the application, capabilities, understanding, and future of ABM.
- Published
- 2010
- Full Text
- View/download PDF
37. A Simulation Validation Method Based on Bootstrapping Applied to an Agent-based Simulation of the Bay of Biscay Historical Scenario
- Author
-
Raymond R. Hill and Lance E. Champagne
- Subjects
Engineering, Operations research, Bootstrapping, Modeling and Simulation, Stochastic simulation, Nonparametric statistics, Verification and validation of computer simulation models, Statistical analysis, Representation (mathematics), Engineering (miscellaneous), Bay
- Abstract
Combat, unlike many real-world processes, tends to be singular in nature. This makes statistical analysis of the combat data problematic. Building stochastic simulation models of combat scenarios provides a means of studying in some detail the particulars of a combat scenario, provided that the model accurately captures the scenario. The scenario of interest in this paper is the WWII Bay of Biscay U-boat campaign, and the model is an agent-based simulation. The challenge is how to validate, or ensure the adequacy of, the simulation representation of the combat scenario with the actual scenario on the basis of comparing simulation output to typically small-sample-size actual combat data. In this paper we give the details of a new statistical methodology for use in validating a mission-level, agent-based model of a historical combat scenario. We use the Bay of Biscay agent-based simulation and develop a bootstrapping technique applied to the small-sample-size actual data from the Bay of Biscay campaign. Results from this bootstrapping statistical technique are compared with commonly used statistical techniques and are shown to improve the comparison of agent-based simulation output to actual real-world data. (A minimal bootstrap-comparison sketch follows this entry.)
- Published
- 2009
- Full Text
- View/download PDF
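A minimal sketch of the bootstrap idea behind the validation method above: resample a small set of historical observations with replacement to form an interval estimate, then check whether the simulation's mean output falls inside it. The numbers are placeholders, not Bay of Biscay data.

```python
# Hedged bootstrap check of simulation output against a small historical sample
# (placeholder numbers; not the Bay of Biscay data).
import numpy as np

rng = np.random.default_rng(7)
historical = np.array([3, 5, 2, 4, 6, 3, 5])      # small-sample "actual" outcomes (assumed)
sim_output = rng.poisson(4.2, size=1000)          # many replications from the simulation

boot_means = np.array([rng.choice(historical, size=historical.size, replace=True).mean()
                       for _ in range(10_000)])
lo, hi = np.percentile(boot_means, [2.5, 97.5])   # bootstrap 95% interval for the mean

print(f"historical mean CI: ({lo:.2f}, {hi:.2f}), simulation mean: {sim_output.mean():.2f}")
print("consistent with history" if lo <= sim_output.mean() <= hi else "flag for review")
```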
38. 10.2.4 Persons in the Processes: Human Systems Integration in Early System Development
- Author
-
Raymond R. Hill, Nicholas Hardman, John M. Colombi, David R. Jacques, and Janet Miller
- Subjects
System of systems, System development, Schedule, Engineering, Risk analysis (engineering), Management science, System of systems engineering, Systems design, Human factors integration, System lifecycle
- Abstract
The systems engineering technical processes often lack the support of methods and tools that quantitatively integrate human considerations into early system design decisions. Because of this, engineers generally rely on qualitative judgments or delay critical decisions until late in the system lifecycle. Studies have revealed that this frequently results in cost, schedule, and effectiveness consequences. We begin this paper with a study of current issues in the application of systems engineering as a whole. We then examine how the challenges of human systems integration are a pervasive factor in those issues. Finally, we propose how to improve system engineering by better addressing the identified challenges.
- Published
- 2009
- Full Text
- View/download PDF
39. What Systems Engineers Need to Know About Human - Computer Interaction
- Author
-
David R. Jacques, Raymond R. Hill, John M. Colombi, and Nicholas Hardman
- Subjects
Engineering, Human–computer interaction, Need to know
- Published
- 2008
- Full Text
- View/download PDF
40. Visualization Tools for Prescriptive Analysis of a Complex Adaptive System Model of Pilot Retention
- Author
-
Raymond R. Hill and Martin P. Gaupp
- Subjects
Inventory control, Engineering, Adaptive method, Complex system, Avionics, Decentralised system, Visualization, Hardware and Architecture, Mechanics of Materials, Modeling and Simulation, Adaptive system, Systems engineering, Electrical and Electronic Engineering, Complex adaptive system, Software, Simulation
- Abstract
This paper discusses a prototype agent-based simulation for examining United States Air Force pilot retention issues. The United States Air Force expends significant time and resources training pil...
- Published
- 2006
- Full Text
- View/download PDF
41. Devising a quick-running heuristic for an unmanned aerial vehicle (UAV) routing system
- Author
-
James T. Moore, G W Kinney, and Raymond R. Hill
- Subjects
Marketing, Operations research, Computer science, Heuristic, Strategy and Management, Distributed computing, Management Science and Operations Research, Travelling salesman problem, Management Information Systems, Scheduling (computing), Local search (optimization), Heuristics
- Abstract
UAVs provide reconnaissance support for the US military and often need operational routes immediately; current practice involves manual route calculation that can involve hundreds of targets and a complex set of operational restrictions. Our research focused on providing an operational UAV routing system. This system required development of a reasonably effective, quick running routing heuristic. We present the statistical methodology used to devise a quick-running routing heuristic that provides reasonable solutions. We consider three candidate local search heuristic approaches, conduct an empirical analysis to parameterize each heuristic, competitively test each candidate heuristic, and provide statistical analysis on the performance of each candidate heuristic to include comparison of the results of the best candidate heuristic against a compilation of the best-known solutions for standard test problems. Our heuristic is a component of the final UAV routing system and provides the UAV operators a tool to perform their route development tasks quickly and efficiently.
- Published
- 2005
- Full Text
- View/download PDF
42. Building the Mobility Aircraft Availability Forecasting (MAAF) Simulation Model and Decision Support System
- Author
-
Patrick J. Vincent, Vikrant Chopra, Raymond R. Hill, Frank W. Ciarallo, Sriram Mahadevan, and Christoper S. Allen
- Subjects
Decision support system, Engineering, Mobility model, Operations research, Mobility system, Sample (statistics), Risk analysis (business), Modeling and Simulation, Component (UML), Aircraft maintenance, Representation (mathematics), Engineering (miscellaneous)
- Abstract
The Mobility Aircraft Availability Forecasting (MAAF) model prototype development and study effort was initiated to help the United States Air Force Air Mobility Command (AMC) answer the question, “How can we accurately predict mission capable (MC) rates?” While perfect prediction of aircraft MC rates is not possible, we investigate a simulation-based risk analysis approach. Current prediction methods utilize “after the fact” analyses and user opinion, making it difficult to perform quick, accurate, and effective analyses of potential limiting factors and policy changes, particularly in time-sensitive situations. This paper describes the MAAF proof-of-concept model and decision support system built to provide AMC managers the dynamic, predictive tools needed to better forecast aircraft availability. The simulation component featured new capabilities for mobility modeling to include dynamic definition of the configuration of a mobility system, dynamic definition of the capabilities of the individual airbases within a mobility system, improved representation of the aircraft objects within the model, and a new approach to modeling aircraft maintenance including the realistic consideration of partially mission capable aircraft. The development efforts and sample experimental results are recounted in this paper.
- Published
- 2005
- Full Text
- View/download PDF
43. A Hybrid Intelligent Logistics Diagnostic Assistant
- Author
-
Raymond R. Hill, Vinu V. Panicker, Sriram Mahadevan, and Frank W. Ciarallo
- Subjects
Schedule, Operations research, Artificial neural network, Process (engineering), Computer science, Rule-based system, Library and Information Sciences, Fault detection and isolation, Management Information Systems, Domain (software engineering), Resource (project management), Pattern matching, Software engineering
- Abstract
Artificial neural networks (ANN) are powerful tools for pattern matching and have proven useful in fault detection and diagnosis. Rule based systems (RBS) have the ability to process complex reasoning paths and explain the results of an analysis. The Hybrid Intelligent Logistics Diagnostic Assistant (HILDA) combines these technologies to augment the abilities of a logistics analyst. HILDA aids a logistics analyst working with the Mobility Aircraft Availability Forecasting (MAAF) simulation in trying to balance the requirements of a mission schedule versus the personnel and aircraft resources available at a set of airbases. This paper describes the design and implementation of the hybrid architecture, with details on the domain specific knowledge embedded in the ANN and RBS modules. Examples show how the HILDA prototype is successful in aiding a decision maker in converging to a successful mission/resource scenario.
- Published
- 2005
- Full Text
- View/download PDF
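To make the hybrid ANN/RBS idea of entry 43 concrete, here is a toy sketch, not the HILDA implementation: a hand-weighted linear scorer stands in for the trained neural network, and a few if/then rules stand in for the rule-based explanation module. The feature names, weights, and thresholds are all assumptions.

```python
# Hypothetical illustration only -- not the HILDA implementation.
# A hand-weighted scorer stands in for the ANN pattern matcher; simple
# if/then rules stand in for the rule-based system (RBS) that explains results.
WEIGHTS = {"late_sorties": 0.5, "crew_shortfall": 0.3, "spares_backorder": 0.2}

def ann_score(obs):
    """Pattern-matching stand-in: weighted evidence (0-1) that a base is stressed."""
    return sum(w * obs.get(name, 0.0) for name, w in WEIGHTS.items())

def rbs_explain(obs, score):
    """Rule-based stand-in: turn the score and raw symptoms into advice."""
    advice = []
    if score > 0.6:
        advice.append("Base appears resource-stressed; consider re-flowing missions.")
    if obs.get("crew_shortfall", 0.0) > 0.5:
        advice.append("Crew shortfall dominates; request additional aircrews.")
    if obs.get("spares_backorder", 0.0) > 0.5:
        advice.append("Spares backorders dominate; expedite parts resupply.")
    return advice or ["No corrective action indicated."]

observation = {"late_sorties": 0.8, "crew_shortfall": 0.7, "spares_backorder": 0.1}
for line in rbs_explain(observation, ann_score(observation)):
    print(line)
```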
44. A Java™ universal vehicle router for routing unmanned aerial vehicles
- Author
-
James T. Moore, Raymond R. Hill, and R.W. Harder
- Subjects
Router ,Static routing ,Computer science ,Interface (Java) ,business.industry ,Strategy and Management ,Distributed computing ,Code reuse ,Management Science and Operations Research ,Solver ,Computer Science Applications ,Management of Technology and Innovation ,Embedded system ,Vehicle routing problem ,Business and International Management ,Routing (electronic design automation) ,User interface ,business - Abstract
We consider vehicle routing problems in the context of the Air Force operational problem of routing unmanned aerial vehicles from base locations to various reconnaissance sites. The unmanned aerial vehicle routing problem requires consideration of heterogeneous vehicles, vehicle endurance limits, time windows and time walls for some of the sites requiring coverage, site priorities, and asymmetric travel distances. We propose a general architecture for operational research problems, specialized here for vehicle routing problems, that encourages object-oriented programming and code reuse. We create an instance of this architecture for the unmanned aerial vehicle routing problem and describe its components, including the general user interface created for the operational users of the system. We employ route-building heuristics and tabu search in a symbiotic fashion to provide a solver interface with a user-defined level of effort. Empirical tests of solution algorithms parameterized for solution speed show that reasonable solution quality is attained. (A brief sketch follows this entry.)
- Published
- 2004
- Full Text
- View/download PDF
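Entry 44 combines route-building heuristics with tabu search. The sketch below shows only a hypothetical greedy construction step for a single UAV under some of the stated side constraints (site priorities, an endurance limit, asymmetric travel times); it is not the Java router from the paper, and a tabu search improvement phase would follow. Site names, travel times, and the endurance value are invented.

```python
# Hypothetical illustration only -- not the Java router from the paper.
# Greedy construction for one UAV: pick the highest-priority site that can
# still be visited and returned from within the endurance limit, honoring
# asymmetric travel times.
def build_route(base, priorities, travel, endurance):
    """priorities: {site: priority}; travel: {(a, b): asymmetric travel time}."""
    route, used, here = [base], 0.0, base
    remaining = dict(priorities)
    while remaining:
        feasible = [(s, p) for s, p in remaining.items()
                    if used + travel[(here, s)] + travel[(s, base)] <= endurance]
        if not feasible:
            break  # nothing else fits within the endurance limit
        nxt = max(feasible, key=lambda sp: (sp[1], -travel[(here, sp[0])]))[0]
        used += travel[(here, nxt)]
        route.append(nxt)
        here = nxt
        del remaining[nxt]
    used += travel[(here, base)]   # return to base
    route.append(base)
    return route, used

travel = {("B", "s1"): 2, ("s1", "B"): 3, ("B", "s2"): 4, ("s2", "B"): 4,
          ("s1", "s2"): 1, ("s2", "s1"): 2}
print(build_route("B", {"s1": 2, "s2": 1}, travel, endurance=9))
```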
45. Using Agent-based Simulation and Game Theory to Examine the WWII Bay of Biscay U-boat Campaign
- Author
-
Raymond R. Hill, Lance E. Champagne, and Joseph C. Price
- Subjects
Engineering ,021103 operations research ,Operations research ,business.industry ,Management science ,World War II ,0211 other engineering and technologies ,Submarine ,020206 networking & telecommunications ,02 engineering and technology ,Agent-based social simulation ,Modeling and simulation ,Modeling and Simulation ,0202 electrical engineering, electronic engineering, information engineering ,business ,Engineering (miscellaneous) ,Game theory ,Bay ,Historical record - Abstract
This paper presents research combining an agent-based modeling and simulation paradigm with game theory for an in silico historical analysis of the Bay of Biscay submarine war during WWII. The U-boat threat was of great concern to the Allies, prompting initial operational research efforts to devise counterstrategies. Focusing search efforts in the Bay of Biscay enabled an effective Allied response. Using the historical record to build a reasonably accurate model of the U-boat campaign, we allow the agents within the model to adapt their strategies to counter opposing strategies. Model output data are examined with respect to the historical record and game theory. The results hold promise for extending the agent-based modeling paradigm into more complex military domains. (An illustrative sketch follows this entry.)
- Published
- 2004
- Full Text
- View/download PDF
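The strategy adaptation in entry 45 can be illustrated, in a deliberately simplified and hypothetical form, by fictitious play on a two-zone search-evasion game; this is not the paper's agent-based model, and the payoff matrix is assumed.

```python
# Hypothetical illustration only -- not the paper's agent-based model.
# A searcher and a U-boat repeatedly adapt mixed strategies over two patrol
# zones via fictitious play on an assumed detection-payoff matrix.
PAYOFF = [[1.0, 0.2],   # searcher in zone 0: high payoff if the boat is there too
          [0.2, 1.0]]   # searcher in zone 1

def fictitious_play(rounds=5000):
    search_counts, boat_counts = [1, 1], [1, 1]   # pseudo-counts of past choices
    for _ in range(rounds):
        boat_mix = [c / sum(boat_counts) for c in boat_counts]
        search_mix = [c / sum(search_counts) for c in search_counts]
        s = max(range(2), key=lambda i: sum(PAYOFF[i][j] * boat_mix[j] for j in range(2)))
        b = min(range(2), key=lambda j: sum(PAYOFF[i][j] * search_mix[i] for i in range(2)))
        search_counts[s] += 1   # searcher best-responds to the boat's history
        boat_counts[b] += 1     # boat best-responds to the searcher's history
    total_s, total_b = sum(search_counts), sum(boat_counts)
    return ([c / total_s for c in search_counts], [c / total_b for c in boat_counts])

print(fictitious_play())   # both mixtures approach (0.5, 0.5) for this symmetric game
```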
46. Process Simulation in Excel for a Quantitative Management Course
- Author
-
Raymond R. Hill
- Subjects
Process modeling ,Process (engineering) ,Computer science ,business.industry ,media_common.quotation_subject ,Simulation modeling ,Sampling (statistics) ,Context (language use) ,Management Science and Operations Research ,Nagging ,Education ,Management Information Systems ,Salient ,Process simulation ,Software engineering ,business ,Simulation ,media_common - Abstract
A nagging limitation of teaching spreadsheet-based quantitative decision-making courses is the sometimes stilted view of simulation presented, a view that overemphasizes sampling simulation at the expense of process simulation. Without a viable spreadsheet-based process simulation package, even the best efforts to overcome this emphasis have offered only passing references to special-purpose simulation packages focused on process simulation. We introduce and discuss SimQuick as an alternative approach for teaching process simulation within the context of a spreadsheet-based approach to teaching simulation. Although an early tool, and somewhat limited compared to special-purpose simulation packages, SimQuick provides a viable means for teaching the process of simulation modeling and reinforcing the salient features of process modeling as a quantitative technique. (A brief illustrative sketch follows this entry.)
- Published
- 2002
- Full Text
- View/download PDF
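Entry 46 contrasts process simulation with sampling simulation. The sketch below is not SimQuick (an Excel add-in); it is a bare-bones single-server process simulation written in Python with arbitrary parameters, showing the queueing dynamics that process simulation captures and pure sampling simulation does not.

```python
# Hypothetical illustration only -- not SimQuick.
# A bare-bones single-server process simulation: entities arrive at random
# and queue for one service station; we report the average wait in queue.
import random

def single_server(n_customers=1000, mean_interarrival=1.0, mean_service=0.8, seed=3):
    rng = random.Random(seed)
    clock = server_free_at = total_wait = 0.0
    for _ in range(n_customers):
        clock += rng.expovariate(1.0 / mean_interarrival)  # next arrival time
        start = max(clock, server_free_at)                 # wait if the server is busy
        total_wait += start - clock
        server_free_at = start + rng.expovariate(1.0 / mean_service)
    return total_wait / n_customers

print(f"Average wait in queue: {single_server():.2f} time units")
```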
47. A Methodology to Reduce Aerospace Ground Equipment Requirements for an Air Expeditionary Force
- Author
-
Raymond R. Hill, Frank C. O'Fearna, and J. O. Miller
- Subjects
Engineering ,Restructuring ,business.industry ,Strategy and Management ,ComputerApplications_COMPUTERSINOTHERSYSTEMS ,Management Science and Operations Research ,Management Information Systems ,Aeronautics ,Software deployment ,Management of Technology and Innovation ,Maintenance actions ,Aircraft maintenance ,Operations management ,Business and International Management ,Aerospace ,business - Abstract
For many years the US military has maintained a significant overseas presence. Changes in the international political landscape have led to military reductions, particularly in this overseas presence. To meet increased deployment demands from US-based stations, the US Air Force is restructuring to accommodate an Expeditionary Aerospace Force (EAF) concept. As the United States Air Force evolves into an EAF using tailored combat force packages to meet specific threats, the requisite support functions for the deployed forces will garner increasing attention. We propose and examine a simulation-based methodology for evaluating options to reduce the amount of support equipment deployed with a tailored force. The simulation model is based on aircraft system failures and the ensuing maintenance actions. These maintenance actions drive the utilisation of aerospace ground equipment used to support aircraft maintenance. Support equipment deployment levels are re-examined using expected equipment utilisation ... (A brief illustrative sketch follows this entry.)
- Published
- 2002
- Full Text
- View/download PDF
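Entry 47 sizes support equipment from simulated maintenance workload. The following hypothetical sketch, not the paper's model, illustrates one way to do that: maintenance actions arrive at random, each holds one unit of a single aerospace ground equipment type for a random duration, and the fleet is sized to the peak concurrent demand. All distributions and values are assumptions.

```python
# Hypothetical illustration only -- not the paper's model.
# Sizes one type of aerospace ground equipment (AGE): maintenance actions
# arrive at random, each ties up one unit for a random duration, and the
# fleet is sized to the simulated peak number of units in concurrent use.
import heapq
import random

def peak_concurrent_demand(n_actions=500, mean_interarrival=2.0,
                           mean_hold=5.0, seed=11):
    rng = random.Random(seed)
    clock, peak, in_use = 0.0, 0, []   # in_use: min-heap of unit release times
    for _ in range(n_actions):
        clock += rng.expovariate(1.0 / mean_interarrival)
        while in_use and in_use[0] <= clock:
            heapq.heappop(in_use)      # a unit was freed before this arrival
        heapq.heappush(in_use, clock + rng.expovariate(1.0 / mean_hold))
        peak = max(peak, len(in_use))
    return peak

print(f"AGE units needed to cover peak demand (one replication): {peak_concurrent_demand()}")
```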
48. Science of Test Research Consortium: Year Two Final Report
- Author
-
G. Geoff Vining, Raymond R. Hill, Douglas C. Montgomery, and Rachel T. Silvestrini
- Subjects
Virginia tech ,Body of knowledge ,Engineering ,Test management ,business.industry ,Informatics ,Library science ,Resource management ,Design systems ,business ,Test (assessment) - Abstract
The Science of Test (SOT) Research Consortium is sponsored and funded by both the Office of the Secretary of Defense (OSD) Director, Operational Test and Evaluation (DOT&E), and the Test Resource Management Center (TRMC) within the OSD Director, Developmental Test and Evaluation. The consortium members are the Air Force Institute of Technology, Department of Operational Sciences; Arizona State University, School of Computing, Informatics, and Decision Systems Engineering; Virginia Tech, Department of Statistics; and the Naval Postgraduate School, Operations Research Department. The SOT research effort commenced in early CY 2011. This report summarizes the accomplishments through the second year of the research consortium. The consortium collectively contributes to the theory of statistical rigor in test, through contributions to the body of knowledge, and to the practice of test, through applied methods and consultations.
- Published
- 2012
- Full Text
- View/download PDF
49. Analysis of the influences of biological variance, measurement error, and uncertainty on retinal photothermal damage threshold studies
- Author
-
Raymond R. Hill, David A. Wooddell, and Christine Schubert-Kabban
- Subjects
Observational error ,Retinal pigment epithelium ,business.industry ,Retinal ,Variance (accounting) ,Photothermal therapy ,chemistry.chemical_compound ,medicine.anatomical_structure ,Optics ,chemistry ,medicine ,Environmental science ,Time domain ,Energy source ,business ,Biological system ,Energy (signal processing) - Abstract
Safe exposure limits for directed energy sources are derived from a compilation of known injury thresholds taken primarily from animal models and simulation data. The summary statistics for these experiments are reported as the exposure level representing a 50% probability of injury, the ED50, and its associated variance. We examine biological variance in focal geometries and thermal properties and the influence each has on single-pulse ED50 threshold studies for 514-, 694-, and 1064-nanometer laser exposures in the thermal damage time domain. The damage threshold is defined as the amount of energy required to produce a retinal burn on at least one retinal pigment epithelium (RPE) cell measuring approximately 10 microns in diameter. A better understanding of experimental variance will allow more accurate safety buffers for exposure limits and improve directed energy research methodology. (A brief illustrative sketch follows this entry.)
- Published
- 2012
- Full Text
- View/download PDF
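Entry 49 centers on the ED50 summary statistic. The sketch below, not the paper's analysis, shows one way an ED50 can be estimated from binary injury outcomes by fitting a logistic dose-response curve with plain gradient ascent; probit fits are the more traditional choice in this literature, and the dose levels and outcomes shown are fabricated solely for illustration.

```python
# Hypothetical illustration only -- not the paper's analysis; the doses and
# outcomes below are fabricated. Fits a logistic dose-response curve by
# gradient ascent and reports the ED50, the exposure giving a 50% predicted
# probability of injury.
import math

def fit_ed50(doses, outcomes, lr=0.05, iters=20000):
    x = [math.log(d) for d in doses]
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = 0.0
        for xi, yi in zip(x, outcomes):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))   # predicted injury probability
            g0 += yi - p
            g1 += (yi - p) * xi
        b0 += lr * g0 / len(x)
        b1 += lr * g1 / len(x)
    return math.exp(-b0 / b1)   # dose at which the predicted probability is 0.5

doses = [10, 20, 40, 80, 160, 320]   # hypothetical exposure levels
lesion = [0, 0, 1, 0, 1, 1]          # hypothetical yes/no injury outcomes
print(f"Estimated ED50: {fit_ed50(doses, lesion):.1f} (same units as dose)")
```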
50. Modeling training effects using a human performance taxonomy
- Author
-
Douglas Paul Meador and Raymond R. Hill
- Subjects
Engineering ,Aircraft ,media_common.quotation_subject ,Poison control ,Human Factors and Ergonomics ,Machine learning ,computer.software_genre ,Task (project management) ,Dreyfus model of skill acquisition ,Behavioral Neuroscience ,Interactivity ,Task Performance and Analysis ,Learning theory ,Performance prediction ,Humans ,Computer Simulation ,Set (psychology) ,Function (engineering) ,Applied Psychology ,media_common ,Stochastic Processes ,business.industry ,Teaching ,Retention, Psychology ,Models, Theoretical ,Military Science ,Artificial intelligence ,business ,computer - Abstract
Objective: The aim of this study was to characterize skill acquisition during training and skill retention as a function of training strategy, retention period, and task type in the form of a numerical model, and then to apply that model to make predictions of performance on an unknown task. Background: Complex systems require efficient and effective training programs for the humans who operate them in a discontinuous fashion. Although there are several constructs for learning theory, models that enable analysts to predict training outcomes are needed during the design of training programs. Method: This study involved 60 participants who were trained on five tasks relevant to RQ-1 Predator unmanned aircraft system sensor operators by one of three strategies representing a continuum of instructor interactivity. After training, performance data for all five tasks were collected. Participants completed the same tasks 30 or 60 days later to determine skill retention and the rate at which task proficiency was reacquired. Results: Models built from tasks that isolate human performance channels adequately predicted performance on a task that combined those channels. Conclusion: Models that predict performance on tasks isolating human performance channels can be used to make predictions on tasks that draw on multiple channels. The model provided a distribution of performance data statistically similar to actual performance data. Application: System designers equipped with human performance data on a set of tasks can apply those tasks’ characteristics to future tasks to make reasonably accurate performance predictions, thereby allowing designers to make early decisions regarding the training strategy used to teach those tasks. (A brief illustrative sketch follows this entry.)
- Published
- 2011