266 results for "Jitesh H. Panchal"
Search Results
52. A Modular Decision-centric Approach for Reusable Design Processes.
- Author
-
Jitesh H. Panchal, Marco Gero Fernández, Christiaan J. J. Paredis, Janet K. Allen, and Farrokh Mistree
- Published
- 2009
- Full Text
- View/download PDF
53. Understanding Communication and Collaboration in Social Product Development Through Social Network Analysis.
- Author
-
Dazhong Wu, David W. Rosen, Jitesh H. Panchal, and Dirk Schaefer
- Published
- 2016
- Full Text
- View/download PDF
54. An Interval-based Constraint Satisfaction (IBCS) Method for Decentralized, Collaborative Multifunctional Design.
- Author
-
Jitesh H. Panchal, Marco Gero Fernández, Christiaan J. J. Paredis, Janet K. Allen, and Farrokh Mistree
- Published
- 2007
- Full Text
- View/download PDF
55. Risk Mitigation for Dynamic State Estimation Against Cyber Attacks and Unknown Inputs.
- Author
-
Ahmad F. Taha, Junjian Qi, Jianhui Wang, and Jitesh H. Panchal
- Published
- 2015
56. Dynamic State Estimation under Cyber Attacks: A Comparative Study of Kalman Filters and Observers.
- Author
-
Ahmad F. Taha, Junjian Qi, Jianhui Wang, and Jitesh H. Panchal
- Published
- 2015
57. Designing for Technical Behaviour
- Author
-
Jitesh H. Panchal and Paul T. Grogan
- Published
- 2021
- Full Text
- View/download PDF
58. A Reusable and Executable Information Model of Experiments on Human Decision Making in Systems Engineering and Design
- Author
-
Guoxin Wang, Jitesh H. Panchal, Bao Yandi, Yan Yan, and Zhenjun Ming
- Subjects
Research design, General Computer Science, Process (engineering), Computer science, Design of experiments, Frame (networking), General Engineering, Reuse, Ontology (information science), Decision making, Behavioral experiments, Information model, Information modeling, Systems engineering, Domain knowledge, General Materials Science, Executable, Electrical engineering. Electronics. Nuclear engineering - Abstract
Understanding decision making in design is becoming increasingly important within systems engineering and design research. While controlled experiments are used for testing hypotheses and play a significant role in developing theories, there is a lack of support for capturing and reusing knowledge associated with experimental design. The goal of this paper is to model the information required for designing and executing experiments, thereby enabling the reuse of past experimental designs. Using declarative formulations of four aspects of an experiment, namely, problem, process, participants, and incentives, we extract the generic elements and standardize the structure of information as a Decision Experiment Design Support (DEDS) template, based on which an ontology is developed. The information necessary to execute an experiment is archived as DEDS templates and represented by a frame-based ontology that can be reused from one experiment to another. The approach is illustrated using two example experiments for studying the decision strategies of information acquisition, and the impact of domain knowledge on decisions. This paper offers a fundamental step towards reducing the barriers in designing and executing behavioral experiments for design and systems engineering research.
- Published
- 2020
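A reading aid for entry 58 above: the four declarative aspects named in the abstract (problem, process, participants, incentives) can be pictured as fields of a reusable template object. The following is a minimal sketch using Python dataclasses with hypothetical field names; it is not the authors' DEDS template schema or ontology.
```python
from dataclasses import dataclass, field, asdict
from typing import List, Dict

@dataclass
class DecisionExperimentTemplate:
    """Toy stand-in for a DEDS-style template (field names are illustrative)."""
    name: str
    problem: Dict[str, str]        # e.g., design variables, objectives, constraints
    process: List[str]             # ordered experimental tasks
    participants: Dict[str, str]   # population, sampling, screening
    incentives: Dict[str, str]     # payment scheme, performance bonuses
    notes: List[str] = field(default_factory=list)

    def reuse(self, **overrides) -> "DecisionExperimentTemplate":
        """Clone the template, overriding only the aspects that change."""
        data = asdict(self)
        data.update(overrides)
        return DecisionExperimentTemplate(**data)

# Archive one experiment, then reuse it with a different incentive scheme.
base = DecisionExperimentTemplate(
    name="information-acquisition-study",
    problem={"objective": "maximize design quality", "budget": "fixed"},
    process=["training", "practice round", "sequential search task", "survey"],
    participants={"population": "engineering students", "n": "40"},
    incentives={"scheme": "flat fee"},
)
variant = base.reuse(name="domain-knowledge-study",
                     incentives={"scheme": "performance-based bonus"})
print(variant)
```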
59. Modeling Airlines’ Route Selection Decisions Under Competition: A Discrete-Games-Based Model
- Author
-
Daniel A. DeLaurentis, Jitesh H. Panchal, Kushal A. Moolchandani, and Joseph Thekinen
- Subjects
Logistics & transportation, Operations research, Linear programming, Computer science, Aviation, Bayesian probability, Aerospace Engineering, Transportation, Management, Monitoring, Policy and Law, Affect (psychology), Support vector machine, Competition (economics), Management of Technology and Innovation, Data file, Economics, Safety Research, Selection (genetic algorithm), Energy (miscellaneous) - Abstract
To analyze the effects of policies within the air transportation network, there is a need to model how policies affect the decisions made by airlines. Because airline decision making is based on pr...
- Published
- 2020
- Full Text
- View/download PDF
60. Evaluating Heuristics in Engineering Design: A Reinforcement Learning Approach
- Author
-
Karim A. ElSayed, Ilias Bilionis, and Jitesh H. Panchal
- Subjects
Computer science, Reinforcement learning, Artificial intelligence, Heuristics, Engineering design process - Abstract
Heuristics are essential for addressing the complexities of engineering design processes. The goodness of heuristics is context-dependent. Appropriately tailored heuristics can enable designers to find good solutions efficiently, and inappropriate heuristics can result in cognitive biases and inferior design outcomes. While there have been several efforts at understanding which heuristics are used by designers, there is a lack of normative understanding about when different heuristics are suitable. Towards addressing this gap, this paper presents a reinforcement learning-based approach to evaluate the goodness of heuristics for three sub-problems commonly faced by designers: (1) learning the map between the design space and the performance space, (2) acquiring sequential information, and (3) stopping the information acquisition process. Using a multi-armed bandit formulation and simulation studies, we learn the suitable heuristics for these individual sub-problems under different resource constraints and problem complexities. Additionally, we learn the optimal heuristics for the combined problem (i.e., the one composing all three sub-problems), and we compare them to ones learned at the sub-problem level. The results of our simulation study indicate that the proposed reinforcement learning-based approach can be effective for determining the quality of heuristics for different problems, and how the effectiveness of the heuristics changes as a function of the designer’s preference (e.g., performance versus cost), the complexity of the problem, and the resources available.
- Published
- 2021
- Full Text
- View/download PDF
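The evaluation setup in entry 60 above can be read as a multi-armed bandit in which each arm is a candidate heuristic. Below is a minimal epsilon-greedy sketch with made-up heuristic names and a synthetic reward model standing in for simulated design outcomes; it is not the authors' formulation.
```python
import random

# Hypothetical heuristics (arms) with synthetic mean rewards standing in for
# simulated design performance under a given budget/complexity setting.
TRUE_MEAN = {"greedy-search": 0.55, "random-restart": 0.48, "stop-early": 0.62}

def pull(heuristic: str) -> float:
    """Simulated noisy reward for applying a heuristic once."""
    return random.gauss(TRUE_MEAN[heuristic], 0.1)

def epsilon_greedy(n_rounds=5000, eps=0.1, seed=0):
    random.seed(seed)
    counts = {h: 0 for h in TRUE_MEAN}
    values = {h: 0.0 for h in TRUE_MEAN}   # running mean reward per heuristic
    for _ in range(n_rounds):
        if random.random() < eps:
            h = random.choice(list(TRUE_MEAN))          # explore
        else:
            h = max(values, key=values.get)             # exploit
        r = pull(h)
        counts[h] += 1
        values[h] += (r - values[h]) / counts[h]        # incremental mean update
    return values, counts

values, counts = epsilon_greedy()
print("estimated value of each heuristic:", values)
print("pull counts:", counts)
```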
61. Co-Evolution of Communication and System Performance in Engineering Systems Design: A Stochastic Network-Behavior Dynamics Model
- Author
-
Zoe Szajnfarber, Jitesh H. Panchal, Ashish M. Chaudhari, and Erica Gralla
- Subjects
Mechanics of Materials, Computer science, Mechanical Engineering, Dynamics (mechanics), Systems design, Control engineering, Network behavior, Computer Graphics and Computer-Aided Design, Computer Science Applications - Abstract
The socio-technical perspective on engineering system design emphasizes the mutual dynamics between interdisciplinary interactions and system design outcomes. How different disciplines interact with each other depends on technical factors such as design interdependence and system performance. On the other hand, the design outcomes are influenced by social factors such as the frequency of interactions and their distribution. Understanding this co-evolution can lead to not only better behavioral insights, but also efficient communication pathways. In this context, we investigate how to quantify the temporal influences of social and technical factors on interdisciplinary interactions and their influence on system performance. We present a stochastic network-behavior dynamics model that quantifies the design interdependence, discipline-specific interaction decisions, the evolution of system performance, as well as their mutual dynamics. We employ two datasets, one of student subjects designing an automotive engine and the other of NASA engineers designing a spacecraft. Then, we apply statistical Bayesian inference to estimate model parameters and compare insights across the two datasets. The results indicate that design interdependence and social network statistics both have strong positive effects on interdisciplinary interactions for the expert and student subjects alike. For the student subjects, an additional modulating effect of system performance on interactions is observed. Inversely, the total number of interactions, irrespective of their discipline-wise distribution, has a weak but statistically significant positive effect on system performance in both cases. However, excessive interactions mirrored with design interdependence and inflexible design space exploration reduce system performance. These insights support the case for open organizational boundaries as a way for increasing interactions and improving system performance.
- Published
- 2021
- Full Text
- View/download PDF
62. A Mixed-Method Analysis of Schedule and Cost Growth in Defense Acquisition Programs
- Author
-
Jitesh H. Panchal, Atharva Hans, Ilias Bilionis, and Ashish M. Chaudhari
- Subjects
Root (linguistics), Schedule, Procurement, Work (electrical), Operations research, Computer science, Python (programming language), Discount points, Maturity (finance), Test (assessment) - Abstract
Cost and schedule overruns are common in the procurement of large-scale defense acquisition programs. Current work focuses on identifying the root causes of cost growth and schedule delays in defense acquisition programs. There is a need for a mix of quantitative and qualitative analysis of cost and schedule overruns that takes into account program factors such as technology maturity, design maturity, initial acquisition time, and program complexity. Such analysis requires an easily accessible database of program-specific data on how an acquisition program's technical and financial characteristics vary over time. To fulfill this need, the objective of this paper is twofold: (i) to develop a database of major US defense weapons programs which includes details of the technical and financial characteristics and how they vary over time, and (ii) to test various hypotheses about the interdependence of such characteristics using the collected data. To achieve the objective, we use a mixed-method analysis on schedule and cost growth data available in the U.S. Government Accountability Office's (GAO's) defense acquisitions annual assessments during the period 2003–2017. We extracted both analytical and textual data from the original reports into Excel files and further created an easily accessible database that can be queried from a Python environment. The analysis reveals that technology immaturity is the major driver of cost and schedule growth during the early stages of acquisition programs, while technical inefficiencies drive cost overruns and schedule delays during the later stages. Further, we find that acquisition programs with a longer initial length do not necessarily have greater cost growth. The dataset and the results provide a useful starting point for the research community for modeling cost and schedule overruns, and for practitioners to inform their systems acquisition processes.
- Published
- 2021
- Full Text
- View/download PDF
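Entry 62 above describes an Excel-backed database that is accessible from a Python environment. The sketch below illustrates the kind of hypothesis check such a database enables, using pandas with hypothetical column names and made-up numbers; the authors' actual schema and data are not reproduced here.
```python
import pandas as pd

# Hypothetical columns; the real database schema may differ.
programs = pd.DataFrame({
    "program": ["A", "B", "C", "D"],
    "tech_maturity_pct": [45, 80, 60, 95],      # % of critical technologies mature at program start
    "initial_length_months": [60, 48, 72, 36],
    "cost_growth_pct": [38.0, 9.5, 22.0, 4.0],
    "schedule_growth_pct": [25.0, 6.0, 18.0, 3.0],
})

# Example hypothesis check: is low technology maturity associated with cost growth?
print(programs[["tech_maturity_pct", "cost_growth_pct"]].corr(method="spearman"))

# Example: does a longer initial schedule imply greater cost growth?
print(programs[["initial_length_months", "cost_growth_pct"]].corr(method="spearman"))
```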
63. Design Engineering in the Age of Industry 4.0
- Author
-
Jitesh H. Panchal, Janet K. Allen, Farrokh Mistree, Sesh Commuri, Roger J. Jiao, Dirk Schaefer, and Jelena Milisavljevic-Syed
- Subjects
Engineering, Industry 4.0, Mechanical Engineering, Computer Graphics and Computer-Aided Design, Manufacturing engineering, Computer Science Applications, Industrial engineering & automation, Mechanics of Materials - Abstract
Industry 4.0 is based on the digitization of manufacturing industries and has raised the prospect for substantial improvements in productivity, quality, and customer satisfaction. This digital transformation not only affects the way products are manufactured but also creates new opportunities for the design of products, processes, services, and systems. Unlike traditional design practices based on system-centric concepts, design for these new opportunities requires a holistic view of the human (stakeholder), artefact (product), and process (realization) dimensions of the design problem. In this paper we envision a “human-cyber-physical view of the systems realization ecosystem,” termed “Design Engineering 4.0 (DE4.0),” to reconceptualize how cyber and physical technologies can be seamlessly integrated to identify and fulfil customer needs and garner the benefits of Industry 4.0. In this paper, we review the evolution of Engineering Design in response to advances in several strategic areas including smart and connected products, end-to-end digital integration, customization and personalization, data-driven design, digital twins and intelligent design automation, extended supply chains and agile collaboration networks, open innovation, co-creation and crowdsourcing, product servitization and anything-as-a-service, and platformization for the sharing economy. We postulate that DE 4.0 will account for drivers such as Internet of Things, Internet of People, Internet of Services, and Internet of Commerce to deliver on the promise of Industry 4.0 effectively and efficiently. Further, we identify key issues to be addressed in DE 4.0 and engage the design research community on the challenges that the future holds.
- Published
- 2021
64. A Decision-Centric Framework for Modeling Evolutionary Complex Systems
- Author
-
Jitesh H. Panchal and Zhenghui Sha
- Subjects
Computer science, Distributed computing, Complex system - Abstract
Human society is facing new challenges, such as cyber-security, environmental safety, and energy sustainability, which cannot be solved by a single engineering product or system. When seeking ways to address these challenges in order to meet fundamental human needs, the solutions often lead to large-scale complex systems. In complex systems, there are large numbers of interacting entities that work together to form a system whose value is greater than the sum of the individual entities.
- Published
- 2021
- Full Text
- View/download PDF
65. A Generative Network Model for Product Evolution.
- Author
-
Qize Le, Zhenghui Sha, and Jitesh H. Panchal
- Published
- 2014
- Full Text
- View/download PDF
66. Modeling the Effect of Product Architecture on Mass-Collaborative Processes.
- Author
-
Qize Le and Jitesh H. Panchal
- Published
- 2011
- Full Text
- View/download PDF
67. Analysis of the Structure and Evolution of an Open-Source Community.
- Author
-
Hao-Yun Huang, Qize Le, and Jitesh H. Panchal
- Published
- 2011
- Full Text
- View/download PDF
68. Extracting the Structure of Design Information From Collaborative Tagging.
- Author
-
Jitesh H. Panchal and Matthias Messer
- Published
- 2011
- Full Text
- View/download PDF
69. A Probabilistic Graphical Method for Quantifying Individuals' Theory-based Causal Knowledge
- Author
-
Atharva Hans, Ashish M. Chaudhari, Ilias Bilionis, and Jitesh H. Panchal
- Published
- 2021
- Full Text
- View/download PDF
70. Test and Evaluation Framework for Multi-Agent Systems of Autonomous Intelligent Agents
- Author
-
Ivan Hernandez, Scott Welch, Laura J. Freeman, Jitesh H. Panchal, Adam Dachowicz, Melanie L. Grande, Andrew B. Lang, Erin Lanus, and Anthony Patrick
- Subjects
FOS: Computer and information sciences, Artificial Intelligence (cs.AI), Software Engineering (cs.SE), Systems and Control (eess.SY), Process (engineering), Computer science, Multi-agent system, Statistical model, Combinatorial interaction testing, Variety (cybernetics), Test (assessment), Software development process, Intelligent agent, Systems engineering, FOS: Electrical engineering, electronic engineering, information engineering - Abstract
Test and evaluation is a necessary process for ensuring that engineered systems perform as intended under a variety of conditions, both expected and unexpected. In this work, we consider the unique challenges of developing a unifying test and evaluation framework for complex ensembles of cyber-physical systems with embedded artificial intelligence. We propose a framework that incorporates test and evaluation not only throughout the development life cycle, but also into operation as the system learns and adapts in a noisy, changing, and contended environment. The framework accounts for the challenges of testing the integration of diverse systems at various hierarchical scales of composition while respecting that testing time and resources are limited. A generic use case is provided for illustrative purposes, and research directions emerging from exploring the use case via the framework are suggested.
- Published
- 2021
- Full Text
- View/download PDF
71. Designing for Technical Behavior
- Author
-
Paul T. Grogan and Jitesh H. Panchal
- Subjects
Computer science - Published
- 2021
- Full Text
- View/download PDF
72. Agent-Based Modeling of Mass-Collaborative Product Development Processes.
- Author
-
Jitesh H. Panchal
- Published
- 2009
- Full Text
- View/download PDF
73. Managing Design-Process Complexity: A Value-of-Information Based Approach for Scale and Decision Decoupling.
- Author
-
Jitesh H. Panchal, Christiaan J. J. Paredis, Janet K. Allen, and Farrokh Mistree
- Published
- 2009
- Full Text
- View/download PDF
74. Special Issue: Analysis and Design of Sociotechnical Systems
- Author
-
Michel-Alexandre Cardin, Zoe Szajnfarber, Jitesh H. Panchal, Katja Hölttä-Otto, Gül E. Okudan Kremer, and Babak Heydari
- Subjects
Technology, Engineering, Science & Technology, Sociotechnical system, Mechanical Engineering, Design Practice and Management, Computer Graphics and Computer-Aided Design, Computer Science Applications, Mechanics of Materials, Systems engineering - Published
- 2020
- Full Text
- View/download PDF
75. Designing Representative Model Worlds to Study Socio-Technical Phenomena: A Case Study of Communication Patterns in Engineering Systems Design
- Author
-
Jitesh H. Panchal, Erica Gralla, Zoe Szajnfarber, Paul T. Grogan, and Ashish M. Chaudhari
- Subjects
Operations research, Sociotechnical system, Mechanics of Materials, Management science, Computer science, Mechanical Engineering, Systems design, Computer Graphics and Computer-Aided Design, Design practice & management, Computer Science Applications - Abstract
The engineering of complex systems, such as aircraft and spacecraft, involves large numbers of individuals within multiple organizations spanning multiple years. Since it is challenging to perform empirical studies directly on real organizations at scale, some researchers in systems engineering and design have begun relying on abstracted model worlds that aim to be representative of the reference socio-technical system, but only preserve some aspects of it. However, there is a lack of corresponding knowledge on how to design representative model worlds for socio-technical research. Our objective is to create such knowledge through a reflective case study of the development of a model world. This "inner" study examines how two factors influence interdisciplinary communication during a concurrent design process. The reference real-world system is a mission design laboratory (MDL) at NASA, and the model world is a simplified engine design problem in an undergraduate classroom environment. Our analysis focuses on the thought process followed, the key model world design decisions made, and a critical assessment of the extent to which communication phenomena in the model world (engine experiment) are representative of the real world (NASA's MDL). We find that the engine experiment preserves some but not all of the communication patterns of interest, and we present case-specific lessons learned for achieving and increasing representativeness in this type of study. More generally, we find that representativeness depends not on matching subjects, tasks, and context separately, but rather on the behavior that emerges from the interplay of these three dimensions.
- Published
- 2020
- Full Text
- View/download PDF
76. Sequential Design Decision Making Under the Influence of Competition: A Protocol Analysis
- Author
-
Jitesh H. Panchal and Murtuza N. Shergadwala
- Subjects
Competition (economics), Operations research, Sequential analysis, Computer science, Protocol analysis - Abstract
In this study, we focus on crowdsourcing contests for engineering design problems where contestants search for design alternatives. Our stakeholder is a designer of such a contest who requires support to make decisions, such as whether to share opponent-specific information with the contestants. There is a significant gap in our understanding of how sharing opponent-specific information influences a contestant's information acquisition decisions, such as whether to stop searching for design alternatives. Such decisions in turn affect the outcomes of a design contest. To address this gap, the objective of this study is to investigate how participants' decision to stop searching for a design solution is influenced by knowledge about their opponent's past performance. The objective is achieved by conducting a protocol study where participants are interviewed at the end of a behavioral experiment. In the experiment, participants compete against opponents with strong (or poor) performance records. We find that individuals make decisions to stop acquiring information based on various thresholds, such as a target design quality, the amount of resources they want to spend, and the amount of design objective improvement they seek in sequential search. The threshold values for such stopping criteria are influenced by the contestant's perception of the competitiveness of their opponent. Such insights can enable contest designers to make decisions about sharing opponent-specific information with participants, such as the resources utilized by the opponent, towards purposefully improving the outcomes of an engineering design contest.
- Published
- 2020
- Full Text
- View/download PDF
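The stopping criteria reported in entry 76 above (a target design quality, a resource budget, and a minimum sought improvement) can be combined into a simple stopping rule. The sketch below is an illustrative composite under those assumptions, not the authors' protocol-coding scheme.
```python
def should_stop(history, target_quality, budget, min_improvement):
    """history: list of best-so-far design qualities after each search step."""
    if not history:
        return False
    if history[-1] >= target_quality:          # reached the quality target
        return True
    if len(history) >= budget:                 # exhausted search resources
        return True
    if len(history) >= 2 and (history[-1] - history[-2]) < min_improvement:
        return True                            # latest sequential improvement too small
    return False

quality_trace = [0.42, 0.55, 0.61, 0.62]
print(should_stop(quality_trace, target_quality=0.8, budget=10, min_improvement=0.05))  # True
```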
77. Quantifying Individuals’ Theory-Based Knowledge Using Probabilistic Causal Graphs: A Bayesian Hierarchical Approach
- Author
-
Jitesh H. Panchal, Atharva Hans, Ilias Bilionis, and Ashish M. Chaudhari
- Subjects
Causal graph, Theoretical computer science, Computer science, Bayesian probability, Probabilistic logic, Theory based - Abstract
Extracting an individual's knowledge structure is a challenging task as it requires formalization of many concepts and their interrelationships. While there has been significant research on how to represent knowledge to support computational design tasks, there is limited understanding of the knowledge structures of human designers. This understanding is necessary for comprehension of cognitive tasks such as decision making and reasoning, and for improving educational programs. In this paper, we focus on quantifying theory-based causal knowledge, which is a specific type of knowledge held by human designers. We develop a probabilistic graph-based model for representing individuals' concept-specific causal knowledge for a given theory. We propose a methodology based on probabilistic directed acyclic graphs (DAGs) that uses a logistic likelihood function for calculating the probability of a correct response. The approach involves a set of questions for gathering responses from 205 engineering students, and a hierarchical Bayesian approach for inferring individuals' DAGs from the observed responses. We compare the proposed model to a baseline three-parameter logistic (3PL) model from item response theory. The results suggest that the graph-based logistic model can estimate individual students' knowledge graphs. Comparisons with the 3PL model indicate that knowledge assessment is more accurate when quantifying knowledge at the level of causal relations than when quantifying it using a scalar ability parameter. The proposed model allows identification of the parts of the curriculum that a student struggles with and the parts they have already mastered, which is essential for remediation.
- Published
- 2020
- Full Text
- View/download PDF
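A simplified reading of the response model in entry 77 above: the probability of a correct answer grows with the fraction of the causal relations required by a question that the student actually holds. The logistic form and parameter values below are illustrative stand-ins, not the authors' hierarchical Bayesian model.
```python
import math

def p_correct(knowledge, required, slope=4.0, offset=0.5):
    """knowledge: set of causal relations the student holds, e.g. ('load', 'stress').
    required: set of relations a question relies on. Returns P(correct response)."""
    coverage = len(knowledge & required) / max(len(required), 1)
    return 1.0 / (1.0 + math.exp(-slope * (coverage - offset)))   # logistic link

student = {("load", "stress"), ("stress", "fatigue_life")}
question = {("load", "stress"), ("geometry", "stress"), ("stress", "fatigue_life")}
print(round(p_correct(student, question), 3))
```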
78. Human Inductive Biases in Design Decision Making
- Author
-
Jitesh H. Panchal and Murtuza N. Shergadwala
- Abstract
Designers make information acquisition decisions, such as where to search and when to stop the search. Such decisions are typically made sequentially, such that at every search step designers gain information by learning about the design space. However, when designers begin acquiring information, their decisions are primarily based on their prior knowledge. Prior knowledge influences the initial set of assumptions that designers use to learn about the design space. These assumptions are collectively termed as inductive biases. Identifying such biases can help us better understand how designers use their prior knowledge to solve problems in the light of uncertainty. Thus, in this study, we identify inductive biases in humans in sequential information acquisition tasks. To do so, we analyze experimental data from a set of behavioral experiments conducted in the past [1–5]. All of these experiments were designed to study various factors that influence sequential information acquisition behaviors. Across these studies, we identify similar decision making behaviors in the participants in their very first decision to “choose x”. We find that their choices of “x” are not uniformly distributed in the design space. Since such experiments are abstractions of real design scenarios, it implies that further contextualization of such experiments would only increase the influence of these biases. Thus, we highlight the need to study the influence of such biases to better understand designer behaviors. We conclude that in the context of Bayesian modeling of designers’ behaviors, utilizing the identified inductive biases would enable us to better model designer’s priors for design search contexts as compared to using non-informative priors.
- Published
- 2020
- Full Text
- View/download PDF
79. Investigating the Challenges of Crowdsourcing for Engineering Design: An Interview Study With Organizations of Different Sizes
- Author
-
Jitesh H. Panchal, Hannah Forbes, Dirk Schaefer, and Murtuza N. Shergadwala
- Subjects
Knowledge management, Leverage (negotiation), Open design, Interview study, Small and medium-sized enterprises, Business, Business model, Engineering design process, Crowdsourcing, Variety (cybernetics) - Abstract
Crowdsourcing has been identified as a valuable paradigm in the open design movement. In engineering design, it offers various benefits, such as the generation of diverse ideas and the involvement of consumers. Despite the potential benefits, there are many ways in which crowdsourcing initiatives may fail. An example of such a failure is when a previously successful initiative for a large organization fails to attract a suitable number of participants with diverse expertise for a start-up. Consequently, the start-up does not receive good sets of ideas, in either quantity or variety. Such failures of crowdsourcing initiatives are common due to the lack of appropriate design of the initiative for organizational characteristics such as size. While frameworks and guidelines exist for the design of crowdsourcing initiatives, whether these are useful for organizations of all sizes is yet to be determined. Large organizations such as Procter & Gamble and NASA now conduct crowdsourcing initiatives regularly. Furthermore, start-ups are emerging that leverage crowdsourcing as an integral part of their business model. In contrast, small and medium-sized enterprises (SMEs) have fallen behind in the adoption of crowdsourcing processes. In this paper, we aim to identify the challenges associated with crowdsourcing and whether and how these differ according to organizational size. We present the results of an interview study with industry professionals from five organizations of varying sizes, and identify key challenges associated with the application of crowdsourcing. This paper discusses suggested support mechanisms for crowdsourcing in SMEs and directions for further research on crowdsourcing in engineering design.
- Published
- 2020
- Full Text
- View/download PDF
80. Risk Mitigation for Dynamic State Estimation Against Cyber Attacks and Unknown Inputs
- Author
-
Jianhui Wang, Ahmad F. Taha, Junjian Qi, and Jitesh H. Panchal
- Subjects
FOS: Computer and information sciences, FOS: Mathematics, Cryptography and Security (cs.CR), Systems and Control (eess.SY), Optimization and Control (math.OC), General Computer Science, Computer science, Phasor, Grid, Power (physics), Reliability engineering, Units of measurement, Electric power system, Observability, State (computer science), Risk management - Abstract
Phasor measurement units (PMUs) can be effectively utilized for the monitoring and control of the power grid. As the cyber-world becomes increasingly embedded into power grids, the risks of this inevitable evolution become serious. In this paper, we present a risk mitigation strategy, based on dynamic state estimation, to eliminate threat levels from the grid’s unknown inputs and potential cyber-attacks. The strategy requires: 1) the potentially incomplete knowledge of power system models and parameters and 2) real-time PMU measurements. First, we utilize a dynamic state estimator for higher order depictions of power system dynamics for simultaneous state and unknown inputs estimation. Second, estimates of cyber-attacks are obtained through an attack detection algorithm. Third, the estimation and detection components are seamlessly utilized in an optimization framework to determine the most impacted PMU measurements. Finally, a risk mitigation strategy is proposed to guarantee the elimination of threats from attacks, ensuring the observability of the power system through available, safe measurements. Case studies are included to validate the proposed approach. Insightful suggestions, extensions, and open problems are also posed.
- Published
- 2018
- Full Text
- View/download PDF
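Entry 80 above combines dynamic state estimation with attack detection. As a generic reference point only, the sketch below shows a linear Kalman filter step with a chi-squared test on the innovation, a common way to flag suspect measurements; the paper's estimator handles higher-order dynamics and unknown inputs, which this sketch does not.
```python
import numpy as np

def kf_step_with_detection(x, P, z, A, C, Q, R, threshold=9.21):
    """One Kalman filter step plus a chi-squared test on the innovation.
    A generic linear-Gaussian stand-in, not the estimator used in the paper.
    threshold ~ chi2(dof=2) at the 1% level for a 2-dimensional measurement."""
    # Predict
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # Innovation and its covariance
    nu = z - C @ x_pred
    S = C @ P_pred @ C.T + R
    score = float(nu.T @ np.linalg.solve(S, nu))   # Mahalanobis distance of the residual
    suspect = score > threshold                    # flag possible attack / bad data
    # Update (skipped if the measurement is flagged)
    if not suspect:
        K = P_pred @ C.T @ np.linalg.inv(S)
        x = x_pred + K @ nu
        P = (np.eye(len(x)) - K @ C) @ P_pred
    else:
        x, P = x_pred, P_pred
    return x, P, suspect

# Tiny 2-state, 2-measurement example with an obviously corrupted measurement.
A = np.eye(2); C = np.eye(2); Q = 0.01 * np.eye(2); R = 0.05 * np.eye(2)
x, P = np.zeros(2), np.eye(2)
x, P, flag = kf_step_with_detection(x, P, np.array([5.0, -4.0]), A, C, Q, R)
print("measurement flagged as suspect:", flag)
```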
81. Stochastic Multiobjective Optimization on a Budget: Application to Multipass Wire Drawing With Quantified Uncertainties
- Author
-
Ilias Bilionis, Piyush Pandita, Jitesh H. Panchal, B. P. Gautham, Pramod Zagade, and Amol Joshi
- Subjects
Statistics and Probability, Mathematical optimization, Control and Optimization, Computer science, Numerical & computational mathematics, Multi-objective optimization, Industrial engineering & automation, FOS: Mathematics, Discrete Mathematics and Combinatorics, Information acquisition, Uncertainty quantification, Optimization and Control (math.OC), Probability (math.PR), Gaussian process, Wire drawing, Bayesian optimization, Modeling and Simulation, Stochastic optimization - Abstract
Design optimization of engineering systems with multiple competing objectives is a painstakingly tedious process, especially when the objective functions are expensive-to-evaluate computer codes with parametric uncertainties. The effectiveness of the state-of-the-art techniques is greatly diminished because they require a large number of objective evaluations, which makes them impractical for problems of the above kind. Bayesian global optimization (BGO) has managed to deal with these challenges in solving single-objective optimization problems and has recently been extended to multi-objective optimization (MOO). BGO models the objectives via probabilistic surrogates and uses the epistemic uncertainty to define an information acquisition function (IAF) that quantifies the merit of evaluating the objective at new designs. This iterative data acquisition process continues until a stopping criterion is met. The most commonly used IAF for MOO is the expected improvement over the dominated hypervolume (EIHV), which in its original form is unable to deal with parametric uncertainties or measurement noise. In this work, we provide a systematic reformulation of EIHV to deal with stochastic MOO problems. The primary contribution of this paper lies in being able to filter out the noise and reformulate the EIHV without having to observe or estimate the stochastic parameters. A byproduct of the probabilistic nature of our methodology is that it enables us to characterize our confidence about the predicted Pareto front. We verify and validate the proposed methodology by applying it to synthetic test problems with known solutions. We demonstrate our approach on an industrial problem of die pass design for a steel wire drawing process.
- Published
- 2018
- Full Text
- View/download PDF
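Entry 81 above builds on the expected improvement over the dominated hypervolume (EIHV). The sketch below is a generic Monte Carlo estimate of expected hypervolume improvement for two minimization objectives, assuming independent Gaussian predictive marginals at a candidate design; it does not implement the stochastic reformulation proposed in the paper.
```python
import numpy as np

def hypervolume_2d(points, ref):
    """Hypervolume dominated by 2-D minimization points w.r.t. a reference point."""
    pts = np.asarray(points, dtype=float)
    pts = pts[(pts[:, 0] < ref[0]) & (pts[:, 1] < ref[1])]          # keep points inside the reference box
    if len(pts) == 0:
        return 0.0
    nd = [p for p in pts
          if not any(q[0] <= p[0] and q[1] <= p[1] and (q != p).any() for q in pts)]
    nd = sorted(nd, key=lambda p: p[0])                              # f1 ascending, f2 descending
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in nd:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

def expected_hvi(mu, sigma, front, ref, n_samples=4000, seed=0):
    """Monte Carlo EIHV at a candidate with independent Gaussian predictive marginals."""
    rng = np.random.default_rng(seed)
    base = hypervolume_2d(front, ref)
    samples = rng.normal(mu, sigma, size=(n_samples, 2))
    gains = [max(hypervolume_2d(list(front) + [s], ref) - base, 0.0) for s in samples]
    return float(np.mean(gains))

front = [(1.0, 4.0), (2.0, 3.0), (4.0, 1.0)]     # current non-dominated designs
print(expected_hvi(mu=[1.5, 2.0], sigma=[0.3, 0.3], front=front, ref=(6.0, 6.0)))
```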
82. Surrogate-based sequential Bayesian experimental design using non-stationary Gaussian Processes
- Author
-
Ilias Bilionis, Jitesh H. Panchal, Nimish M. Awalgaonkar, Panagiotis Tsilifis, and Piyush Pandita
- Subjects
Optimal design, Mathematical optimization, Optimality criterion, Computer science, Mechanical Engineering, Monte Carlo method, Bayesian probability, Computational Mechanics, General Physics and Astronomy, Estimator, Computer Science Applications, Mechanics of Materials, Sequential analysis, Bayesian experimental design, Gaussian process - Abstract
Inferring arbitrary quantities of interest (QoI) using limited computational or, in realistic scenarios, financial budgets is a challenging problem that requires sophisticated strategies for the optimal allocation of the available resources. Bayesian optimal experimental design identifies the optimal set of design locations for the purpose of solving a parameter inference problem, and the optimality criterion is typically associated with maximizing the worth of information in the experimental measurements. Sequential design strategies further identify the optimal design in a sequential manner, starting from an initial budget and iteratively selecting new optimal points until either an accuracy threshold is reached or a cost limit is exceeded. In this paper, we present a generic sequential Bayesian experimental design framework that relies on maximizing an information-theoretic design criterion, namely the Expected Information Gain, in order to infer QoIs formed as nonlinear operators acting on black-box functions. Our framework relies on modeling the underlying response function using non-stationary Gaussian Processes, thus enabling efficient sampling from the QoI in order to provide Monte Carlo estimators for the design criterion. We demonstrate the performance of our method on an engineering problem of steel wire manufacturing and compare it with two classic approaches: uncertainty sampling and expected improvement.
- Published
- 2021
- Full Text
- View/download PDF
83. Microstructure-Based Counterfeit Detection in Metal Part Manufacturing
- Author
-
Jitesh H. Panchal, Mikhail J. Atallah, Adam Dachowicz, and Siva Chaitanya Chaduvula
- Subjects
Engineering drawing, Engineering, Wear and tear, Physical unclonable function, General Engineering, Microstructure, Reliability engineering, Counterfeit, Industrial engineering & automation, Robustness (computer science), Production (economics), General Materials Science, Protocol (object-oriented programming), Randomness - Abstract
Counterfeiting in metal part manufacturing has become a major global concern. Although significant effort has been made in detecting such counterfeits, modern approaches suffer from high expense during production, invasiveness during manufacture, and unreliability in practice if parts are damaged during use. In this paper, a practical microstructure-based counterfeit detection methodology is proposed, which draws on the inherent randomness present in the microstructure as a result of the manufacturing process. An optical Physically Unclonable Function (PUF) protocol is developed that takes a micrograph as input and outputs a compact, unique string representation of the micrograph. The uniqueness of the outputs and their robustness to moderate wear and tear are demonstrated by application of the methodology to brass samples. The protocol is shown to have good discriminatory power even between samples manufactured in the same batch, and runs on the order of several seconds per part on inexpensive machines.
- Published
- 2017
- Full Text
- View/download PDF
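Entry 83 above maps a micrograph to a compact, unique string. As a loose illustration of that idea, the sketch below computes a block-mean, perceptual-hash-style bit string and compares fingerprints by Hamming distance; it is not the optical PUF protocol developed in the paper.
```python
import numpy as np

def micrograph_fingerprint(image, grid=(8, 8)):
    """Compact bit string from a 2-D grayscale micrograph: compare block means to the median.
    A generic perceptual-hash-style sketch, not the PUF protocol in the paper."""
    img = np.asarray(image, dtype=float)
    gh, gw = grid
    h, w = img.shape
    blocks = img[:h - h % gh, :w - w % gw].reshape(gh, h // gh, gw, w // gw)
    means = blocks.mean(axis=(1, 3))                     # one mean intensity per block
    bits = (means > np.median(means)).astype(int).ravel()
    return "".join(map(str, bits))

def hamming(a: str, b: str) -> int:
    return sum(x != y for x, y in zip(a, b))

rng = np.random.default_rng(1)
part = rng.random((256, 256))                            # stand-in for a real micrograph
worn = np.clip(part + 0.02 * rng.standard_normal(part.shape), 0, 1)   # mild wear/noise
other = rng.random((256, 256))                           # a different part
f1, f2, f3 = map(micrograph_fingerprint, (part, worn, other))
print("same part after wear:", hamming(f1, f2), "of", len(f1), "bits differ")
print("different part:      ", hamming(f1, f3), "of", len(f1), "bits differ")
```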
84. Resource allocation in cloud-based design and manufacturing: A mechanism design approach
- Author
-
Joseph Thekinen and Jitesh H. Panchal
- Subjects
Service (systems architecture), Cloud-based design and manufacturing, Matching (statistics), Mathematical optimization, Optimal matching, Computer science, Pareto efficiency, Service provider, Industrial and Manufacturing Engineering, Industrial engineering & automation, Resource (project management), Hardware and Architecture, Control and Systems Engineering, Resource allocation, Artificial intelligence & image processing, Software - Abstract
The focus of this paper is on matching service seekers and service providers, such as designers and machine owners, in cloud-based design and manufacturing (CBDM). In such decentralized scenarios, the objectives and preferences of service seekers are different from those of service providers. Current resource configuration methods are unsuitable because they optimize the objectives of only one type of participant – either service seekers or service providers. Existing marketplaces based on a first-come-first-served (FCFS) approach are inefficient because they may not result in optimal matches. To address these limitations, there is a need for mechanisms that result in optimal matching considering the private preferences of all the agents. In this paper, we formulate the resource allocation problem as a bipartite matching problem. Four bipartite matching mechanisms, namely, Deferred Acceptance (DA), Top Trading Cycle (TTC), Munkres, and FCFS, are analyzed with respect to desired properties of the mechanisms such as individual rationality, stability, strategy proofness, consistency, monotonicity, and Pareto efficiency. Further, the performance of these mechanisms is evaluated under different levels of resource availability through simulation studies. The appropriateness of the matching mechanisms for different scenarios in CBDM, such as fully decentralized, partially decentralized, and totally monopolistic, is assessed. Based on the analysis, we conclude that DA is the best mechanism for the fully decentralized scenario, TTC is most appropriate when cloud-based resources are used in an organizational scenario, and Munkres is the best mechanism when all resources are owned by a single agent.
- Published
- 2017
- Full Text
- View/download PDF
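Entry 84 above recommends Deferred Acceptance (Gale–Shapley) for the fully decentralized case. A compact seeker-proposing sketch is shown below; the preference lists are made up, and the sketch assumes equal numbers of seekers and providers with complete, strict preferences.
```python
def deferred_acceptance(seeker_prefs, provider_prefs):
    """Seeker-proposing Gale–Shapley deferred acceptance.
    seeker_prefs / provider_prefs: dicts mapping an agent to its ranked list of the other side."""
    rank = {p: {s: i for i, s in enumerate(prefs)} for p, prefs in provider_prefs.items()}
    next_choice = {s: 0 for s in seeker_prefs}          # index of the next provider to propose to
    match = {}                                          # provider -> tentatively accepted seeker
    free = list(seeker_prefs)
    while free:
        s = free.pop()
        p = seeker_prefs[s][next_choice[s]]
        next_choice[s] += 1
        if p not in match:
            match[p] = s                                # provider tentatively accepts
        elif rank[p][s] < rank[p][match[p]]:
            free.append(match[p])                       # provider trades up; old seeker is free again
            match[p] = s
        else:
            free.append(s)                              # rejected; s proposes again later
    return {s: p for p, s in match.items()}

seeker_prefs = {"designer1": ["shopA", "shopB"], "designer2": ["shopA", "shopB"]}
provider_prefs = {"shopA": ["designer2", "designer1"], "shopB": ["designer1", "designer2"]}
print(deferred_acceptance(seeker_prefs, provider_prefs))
# {'designer2': 'shopA', 'designer1': 'shopB'} -- a stable matching
```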
85. Descriptive Models of Sequential Decisions in Engineering Design: An Experimental Study
- Author
-
Jitesh H. Panchal, Ilias Bilionis, and Ashish M. Chaudhari
- Subjects
Statistics & probability, Mechanics of Materials, Computer science, Management science, Mechanical Engineering, Psychology and cognitive sciences, Engineering design process, Computer Graphics and Computer-Aided Design, Experimental psychology, Computer Science Applications - Abstract
Engineering design involves information acquisition decisions such as selecting designs in the design space for testing, selecting information sources, and deciding when to stop design exploration. Existing literature has established normative models for these decisions, but there is a lack of knowledge about how human designers make these decisions and which strategies they use. This knowledge is important for accurately modeling design decisions, identifying sources of inefficiencies, and improving the design process. Therefore, the primary objective in this study is to identify models that provide the best description of a designer's information acquisition decisions when multiple information sources are present and the total budget is limited. We conduct a controlled human subject experiment with two independent variables: the amount of fixed budget and the monetary incentive proportional to the saved budget. By using the experimental observations, we perform Bayesian model comparison on various simple heuristic models and expected utility (EU)-based models. As expected, the subjects' decisions are better represented by the heuristic models than the EU-based models. While the EU-based models result in better net payoff, the heuristic models used by the subjects generate better design performance. The net payoff using heuristic models is closer to that of the EU-based models in experimental treatments where the budget is low and there is an incentive for saving the budget. This indicates the potential for nudging designers' decisions toward maximizing the net payoff by setting the fixed budget at low values and providing monetary incentives proportional to the saved budget.
- Published
- 2020
- Full Text
- View/download PDF
86. Design for Crashworthiness of Categorical Multimaterial Structures Using Cluster Analysis and Bayesian Optimization
- Author
-
Kai Liu, Duane Detwiler, Tong Wu, Andres Tovar, and Jitesh H. Panchal
- Subjects
Applied physics, Computer science, Mechanical Engineering, Bayesian optimization, Computer Graphics and Computer-Aided Design, Computer Science Applications, Applied mathematics, Conceptual design, Mechanics of Materials, Cluster (physics), Crashworthiness, Data mining, Categorical variable - Abstract
This work introduces a cluster-based structural optimization (CBSO) method for the design of categorical multimaterial structures subjected to dynamic crushing loads. The proposed method consists of three steps: conceptual design generation, design clustering, and Bayesian optimization. In the first step, a conceptual design is generated using the hybrid cellular automaton (HCA) algorithm. In the second step, threshold-based cluster analysis yields a lower-dimensional design. Here, a cluster validity index for structural optimization is introduced in order to qualitatively evaluate the clustered design. In the third step, the optimal design is obtained through Bayesian optimization, minimizing a constrained expected improvement function. This function makes it possible to impose soft constraints by redefining the expected improvement based on the maximum constraint violation. The Bayesian optimization algorithm implemented in this work has the ability to search over (i) a real design space for sizing optimization, (ii) a categorical design space for material selection, or (iii) a mixed design space for concurrent sizing optimization and material selection. With the proposed method, materials are optimally selected based on multiple attributes and multiple objectives without the need for material ranking. The effectiveness of this approach is demonstrated with the design for crashworthiness of multimaterial plates and thin-walled structures.
- Published
- 2019
- Full Text
- View/download PDF
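Entry 86 above minimizes a constrained expected improvement function. For reference, the sketch below shows the common "expected improvement times probability of feasibility" variant in closed form; the paper instead redefines the expected improvement using the maximum constraint violation, which is not reproduced here.
```python
from scipy.stats import norm

def constrained_expected_improvement(mu, sigma, f_best, mu_c, sigma_c):
    """EI for minimization at a candidate, weighted by the probability that a
    constraint g(x) <= 0 is satisfied. GP predictions are (mu, sigma) for the
    objective and (mu_c, sigma_c) for the constraint. This is the common
    EI-times-feasibility form, not the violation-based redefinition in the paper."""
    if sigma <= 0:
        return 0.0
    z = (f_best - mu) / sigma
    ei = (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)   # closed-form expected improvement
    p_feasible = norm.cdf(-mu_c / sigma_c)                   # P(g(x) <= 0) under the GP
    return ei * p_feasible

# Candidate predicted slightly better than the incumbent and likely feasible.
print(constrained_expected_improvement(mu=9.5, sigma=1.0, f_best=10.0, mu_c=-0.5, sigma_c=0.4))
```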
87. Students’ Decision-Making in a Product Design Process: An Observational Study
- Author
-
Karthik Ramani, Jitesh H. Panchal, and Murtuza N. Shergadwala
- Subjects
Product design, Computer science, Computer Aided Design, Observational study, Manufacturing engineering - Abstract
The objective of this study is to investigate students' decision-making during the information gathering activities of a design process. Existing literature in engineering education has shown that students face difficulties while gathering information in various activities of a design process, such as brainstorming and CAD modeling. Decision-making is an important aspect of these activities. While gathering information, students make several decisions, such as what information to acquire and how to acquire that information. There is a research gap in understanding how students make decisions while gathering information in a product design process. To address this gap, we conduct semi-structured interviews and surveys in a product design course. We analyze the students' decision-making activities through the lens of a sequential information acquisition and decision-making (SIADM) framework. We find that the students recognize the need to acquire information about the physics and dynamics of their design artifact during the CAD modeling activity of the product design process. However, they do not acquire such information from their CAD models, primarily due to the lack of project requirements, ability, and time to do so. Instead, they acquire such information from the prototyping activity, as their physical prototype does not satisfy their design objectives. However, the students do not get the opportunity to iterate on their prototype within the given cost and time constraints. Consequently, they rely on improvising during prototyping. Based on our observations, we discuss the need for designing course project activities such that they facilitate students' product design decisions.
- Published
- 2019
- Full Text
- View/download PDF
88. Similarity in Engineering Design: A Knowledge-Based Approach
- Author
-
Ashish M. Chaudhari, Jitesh H. Panchal, and Ilias Bilionis
- Subjects
Similarity (network science), Computer science, Artificial intelligence, Engineering design process, Machine learning - Abstract
Similarity assessment is a cognitive activity that pervades engineering design practice, research, and education. There has been a significant effort in understanding similarity in cognitive science, and some recent efforts on quantifying the similarity of design problems in the engineering design community. However, there is a lack of approaches for measuring similarity in engineering design that embody the characteristics identified in cognitive science and account for the nature of design activities, particularly in the embodiment design phase where scientific knowledge plays a significant role. To address this gap, we present an approach for measuring the similarity among design problems. The approach consists of (i) modeling knowledge using probabilistic graphical models, (ii) modeling the functional mapping between design characteristics and the performance measures relevant in a particular context, and (iii) modeling the dissimilarity using KL-divergence in the performance space. We illustrate the approach using an example of a parametric shaft design for fatigue, which is typically a part of mechanical engineering design curricula, and test the validity of the approach using an experimental study involving 167 student subjects. The results indicate that the proposed approach can capture the well-documented characteristics of similarity, including directionality, context dependence, individual-specificity, and its dynamic nature. The approach is general enough that it can be extended further for assessing the similarity of design problems for analogical design, for assessing the similarity of experimental design tasks to real design settings, and for evaluating the similarity between design problems in educational settings.
- Published
- 2019
- Full Text
- View/download PDF
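Entry 88 above measures dissimilarity between design problems with KL divergence in the performance space. If each problem's predicted performance is summarized by a univariate Gaussian, the KL divergence has a standard closed form; the sketch below uses that formula with made-up numbers and shows the directionality (asymmetry) noted in the abstract.
```python
import math

def kl_gauss(mu0, sigma0, mu1, sigma1):
    """KL( N(mu0, sigma0^2) || N(mu1, sigma1^2) ) -- standard closed form."""
    return (math.log(sigma1 / sigma0)
            + (sigma0**2 + (mu0 - mu1)**2) / (2 * sigma1**2)
            - 0.5)

# Predicted fatigue safety-factor distributions for two shaft design problems (made-up numbers).
problem_A = (1.8, 0.20)   # mean, standard deviation
problem_B = (2.3, 0.45)
print("KL(A || B) =", round(kl_gauss(*problem_A, *problem_B), 3))
print("KL(B || A) =", round(kl_gauss(*problem_B, *problem_A), 3))   # differs: the measure is directional
```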
89. Spacecraft Failure Analysis From the Perspective of Design Decision-Making
- Author
-
Vikranth Reddy Kattakuri and Jitesh H. Panchal
- Subjects
Spacecraft, Computer science, Perspective (graphical), Systems engineering, Process control, Failure data - Abstract
Space mission-related projects are demanding and risky undertakings because of their complexity and cost. Many missions have failed over the years due to anomalies in either the launch vehicle or the spacecraft. Projects of such magnitude with undetected flaws due to ineffective process controls account for huge losses. Such failures continue to occur despite studies on systems engineering process deficiencies and the state-of-the-art systems engineering practices in place. To further explore the reasons behind the majority of the failures, we analyzed the failure data of space missions that occurred over the last decade. Based on that information, we studied the launch-related failure events from a design decision-making perspective by employing a failure event chain-based framework, and identified some dominant cognitive biases that might have impacted the overall system performance, leading to unintended catastrophes. The results of the study are presented in this paper.
- Published
- 2019
- Full Text
- View/download PDF
90. Extraction and Analysis of Spatial Correlation Micrograph Features for Traceability in Manufacturing
- Author
-
Mikhail J. Atallah, Adam Dachowicz, and Jitesh H. Panchal
- Subjects
Spatial correlation, Materials science, Micrograph, Traceability, Computer science, Feature extraction, String (computer science), Extraction (chemistry), Pattern recognition, Nanoscience & nanotechnology, Computer Graphics and Computer-Aided Design, Industrial and Manufacturing Engineering, Computer Science Applications, Optical microscope, Artificial intelligence, Biological system, Software, Mining & metallurgy - Abstract
We propose a method for ensuring traceability of metal goods in an efficient and secure manner that leverages data obtained from micrographs of a part's surface, data that are instance specific (i.e., different for another instance of that same part). All stakeholders in modern supply chains face a growing need to ensure quality and trust in the goods they produce. Complex supply chains open many opportunities for counterfeiters, saboteurs, or other attackers to infiltrate supply networks, and existing methods for preventing such attacks can be costly, invasive, and ineffective. The proposed method extracts discriminatory yet robust intrinsic strings using features extracted from the two-point autocorrelation data of surface microstructures, as well as from local volume fraction data. Using a synthetic dataset of three-phase micrographs similar to those obtained from metal alloy systems with low-cost optical microscopy techniques, we discuss tailoring the method with respect to cost and security, the performance of the method in the context of anticounterfeiting, and how similar methods may be evaluated for performance. Cryptographic extensions of this methodology are also discussed.
- Published
- 2019
- Full Text
- View/download PDF
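The two-point autocorrelation features in entry 90 above can be computed efficiently with FFTs (the Wiener–Khinchin route). The sketch below does this for a synthetic binary phase map; it illustrates the statistic itself, not the authors' full feature-extraction or string-generation pipeline.
```python
import numpy as np

def two_point_autocorrelation(phase_map):
    """Normalized two-point autocorrelation of a binary phase indicator field,
    computed with FFTs (periodic boundary assumption)."""
    f = np.asarray(phase_map, dtype=float)
    F = np.fft.fft2(f)
    corr = np.fft.ifft2(F * np.conj(F)).real / f.size     # autocorrelation of the indicator field
    return np.fft.fftshift(corr)                          # put zero displacement at the center

rng = np.random.default_rng(0)
micro = (rng.random((128, 128)) < 0.3).astype(float)      # synthetic phase, ~30% volume fraction
S2 = two_point_autocorrelation(micro)
print("S2 at zero displacement (= volume fraction):", round(float(S2[64, 64]), 3))
print("S2 at large displacement (~ volume fraction squared):", round(float(S2[0, 0]), 3))
```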
91. Bayesian Optimal Design of Experiments for Inferring the Statistical Expectation of Expensive Black-Box Functions
- Author
-
Ilias Bilionis, Jitesh H. Panchal, and Piyush Pandita
- Subjects
Computer science, Mechanical Engineering, Numerical & computational mathematics, Computer Graphics and Computer-Aided Design, Computer Science Applications, Statistics & probability, Mechanics of Materials, Bayesian optimal design, Black box, Probability distribution, Response surface methodology, Algorithm - Abstract
Bayesian optimal design of experiments (BODE) has been successful in acquiring information about a quantity of interest (QoI) that depends on a black-box function. BODE is characterized by sequentially querying the function at specific designs selected by an infill-sampling criterion. However, most current BODE methods operate in specific contexts, like optimization or learning a universal representation of the black-box function. The objective of this paper is to design a BODE for estimating the statistical expectation of a physical response surface. This QoI is omnipresent in uncertainty propagation and design under uncertainty problems. Our hypothesis is that an optimal BODE should be maximizing the expected information gain in the QoI. We represent the information gain from a hypothetical experiment as the Kullback–Leibler (KL) divergence between the prior and the posterior probability distributions of the QoI. The prior distribution of the QoI is conditioned on the observed data, and the posterior distribution of the QoI is conditioned on the observed data and a hypothetical experiment. The main contribution of this paper is the derivation of a semi-analytic mathematical formula for the expected information gain about the statistical expectation of a physical response. The developed BODE is validated on synthetic functions with varying numbers of input dimensions. We demonstrate the performance of the methodology on a steel wire manufacturing problem.
- Published
- 2019
- Full Text
- View/download PDF
92. Architecting Fail‐Safe Supply Networks
- Author
-
Shabnam Rezapour, Amirhossein Khosrojerdi, Golnoosh Rasoulifar, Janet K. Allen, Jitesh H. Panchal, Ramakrishnan S. Srinivasan, Jeffrey D. Tew, and Farrokh Mistree
- Subjects
- Business logistics, Materials management
- Abstract
A fail-safe supply network is designed to mitigate the impact of variations and disruptions on people and corporations. This is achieved by (1) developing a network structure to mitigate the impact of disruptions that distort the network structure and (2) planning flow through the network to neutralize the effects of variations. In this monograph, we propose a framework, develop mathematical models, and provide examples of fail-safe supply network design. We show that, contrary to current thinking as embodied in the supply network literature, disruption management decisions made at the strategic network design level are not independent of variation management decisions made at the operational level. Accordingly, we suggest that it is beneficial to manage disruptions and variations concurrently in supply networks. This is achieved by architecting fail-safe supply networks, which are characterized by the following elements: reliability, robustness, flexibility, structural controllability, and resilience. Organizations can use the framework presented in this monograph to manage variations and disruptions. Managers can select the best operational management strategies for their supply networks considering variations in supply and demand, and identify the best network restoration strategies, including facility fortification, backup inventory, flexible production capacity, flexible inventory, and transportation route reconfiguration. The framework is generalizable to other complex engineered networks.
- Published
- 2019
93. Ontology-based executable design decision template representation and reuse
- Author
-
Jitesh H. Panchal, Guoxin Wang, Yan Yan, Janet K. Allen, Farrokh Mistree, Zhenjun Ming, and Chung-Hyun Goh
- Subjects
Decision support system, Computer science, Context (language use), Ontology (information science), Reuse, Industrial and Manufacturing Engineering, Consistency (database systems), Industrial engineering & automation, Artificial Intelligence, Decision-making, Design practice & management, Programming language, Principal (computer security), Construct (python library), Modular design, Visualization, Data mining, Executable, Software engineering - Abstract
The Decision Support Problem (DSP) construct is anchored in the notion that design is fundamentally a decision-making process. Key to the construct are two types of decisions, namely, selection and compromise, and the idea that any complex design can be represented by modelling a network of compromise and selection decisions. In a computational environment, DSPs are modeled as decision templates. In this paper, modular, executable, reusable decision templates are proposed as a means to effect design and to archive design-related knowledge on a computer. In the context of the compromise Decision Support Problem (cDSP), we address two questions: (1) What are the salient features for facilitating the reuse of design decision templates? (2) What are the salient features that facilitate maintaining model consistency when reusing design decision templates? The first question is answered by identifying reuse patterns in which specific modifications of existing cDSP templates are made to adapt to new design requirements; the second is answered by developing an ontology-based cDSP template representation method in which a rule-based reasoning mechanism is used for consistency checking. The effectiveness of the ontology-based cDSP representation and reuse is demonstrated for the redesign of a pressure vessel. (An illustrative template-and-consistency-check sketch follows this record.)
- Published
- 2016
- Full Text
- View/download PDF
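The abstract above describes template representation and rule-based consistency checking only at a conceptual level. The following Python fragment is a minimal sketch, under assumptions made here, of what an executable cDSP-style template and a single consistency rule might look like; the class names, fields, and the one rule are illustrative and are not the ontology or rules developed in the paper.

```python
from dataclasses import dataclass, field

@dataclass
class CDSPTemplate:
    """A minimal, illustrative cDSP-style decision template (not the paper's ontology)."""
    name: str
    variables: dict = field(default_factory=dict)    # variable name -> (lower, upper) bounds
    constraints: list = field(default_factory=list)  # constraint expressions over variables
    goals: list = field(default_factory=list)        # goal statements

    def referenced_names(self) -> set:
        """Names used in constraint expressions (simple whitespace tokenization)."""
        tokens = " ".join(self.constraints).replace("(", " ").replace(")", " ").split()
        return {t for t in tokens if t.isidentifier()}

def check_consistency(template: CDSPTemplate) -> list:
    """One rule-based check: constraints may only reference declared variables, and each
    declared variable must have well-ordered bounds. A real system would apply many rules."""
    issues = []
    undeclared = template.referenced_names() - set(template.variables)
    issues += [f"Undeclared variable '{v}'" for v in sorted(undeclared)]
    issues += [f"Invalid bounds for '{v}': {lo} > {hi}"
               for v, (lo, hi) in template.variables.items() if lo > hi]
    return issues

# Reuse scenario: adapt an archived template by adding a constraint, then re-check it.
vessel = CDSPTemplate(
    name="pressure_vessel_redesign",
    variables={"thickness": (0.005, 0.05), "radius": (0.2, 1.0)},
    constraints=["thickness >= 0.0193 * radius"],
    goals=["minimize mass"],
)
vessel.constraints.append("length <= 4.0 * radius")   # 'length' was never declared
print(check_consistency(vessel))                      # -> ["Undeclared variable 'length'"]
```

In the paper the templates are archived in an ontology and the rules are expressed over that ontology; the dataclass-plus-function pairing here only stands in for that pattern.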
94. A method for the preliminary design of gears using a reduced number of American Gear Manufacturers Association (AGMA) correction factors
- Author
-
Jitesh H. Panchal, Janet K. Allen, Nagesh Kulkarni, B. P. Gautham, and Farrokh Mistree
- Subjects
Engineering, Engineering drawing, Control and Optimization, Spur gear, Applied Mathematics, Empirical correction, Automotive industry, Management Science and Operations Research, Mechanical components, Industrial and Manufacturing Engineering, Computer Science Applications, Reliability engineering, Robust design, Factor of safety
Typically, the preliminary design of mechanical components such as gears is carried out using standardized design processes such as those developed by the American Gear Manufacturers Association (AGMA). These design standards include a large number of 'correction factors' to account for various uncertainties. As knowledge about these uncertainties increases, it becomes possible to include them systematically in the design procedure, thereby reducing the number of empirical correction factors. Robust design provides a way to design in the presence of various uncertainties. In this article, a design method is proposed to eliminate empirical correction factors and is demonstrated by eliminating two correction factors from the AGMA design standards for a spur gear, namely, the factor of safety in contact and the reliability factor, through the formal introduction of uncertainty in the magnitude of the load and the material properties. The proposed method is illustrated with the design of an automotive gear with... (A hedged Monte Carlo illustration of replacing empirical factors with explicit uncertainty follows this record.)
- Published
- 2016
- Full Text
- View/download PDF
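The abstract above argues for replacing empirical correction factors with explicitly modelled uncertainty. The Python fragment below is a minimal sketch of that general idea under assumptions made here: the distributions, the lumped geometry constant, and the square-root stress proxy are all illustrative and are not the AGMA relations or the data used in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical uncertain inputs (distributions and values are illustrative, not from the article).
n = 100_000
load = rng.normal(loc=2000.0, scale=150.0, size=n)       # transmitted load, N
strength = rng.normal(loc=1100.0, scale=60.0, size=n)    # allowable contact stress, MPa

# Simplified contact-stress proxy: for fixed geometry, contact stress scales with the
# square root of the load; the constant lumps elastic and geometry terms.
geometry_constant = 23.0                                 # MPa per sqrt(N), assumed
stress = geometry_constant * np.sqrt(load)

# Reliability estimated directly from the joint uncertainty, in place of an empirical
# factor of safety in contact and a tabulated reliability factor.
reliability = np.mean(stress < strength)
print(f"Estimated reliability: {reliability:.4f}")
```

The design choice being illustrated is only the shift from fixed multiplicative factors to a probability computed from modelled load and material uncertainty; the gear-specific relations would replace the proxy above in an actual application.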
95. Modeling Airlines’ Decisions on City-Pair Route Selection Using Discrete Choice Models
- Author
-
Kushal A. Moolchandani, Zhenghui Sha, Daniel A. DeLaurentis, and Jitesh H. Panchal
- Subjects
Estimation, Logistics & transportation, Aerospace & aeronautics, Discrete choice, Operations research, Aviation, Aerospace Engineering, Transportation, Regression analysis, Management, Monitoring, Policy and Law, Supply and demand, Support vector machine, Transport engineering, Identification (information), Management of Technology and Innovation, Economics, Safety Research, Selection, Energy (miscellaneous)
An approach based on discrete-choice random-utility theory is presented to model airlines' decisions to strategically add or delete city-pair routes. The approach consists of methods for identifying air transportation networks, determining choice sets, and comparing and validating the developed discrete choice models. It enables the quantification and estimation of airlines' decision-making preferences with respect to the identified explanatory variables, including market demand, direct operating costs, distance, and whether the terminal airports are hubs. Market demand is observed to affect decisions on route deletion more strongly than route addition. Furthermore, the effect of direct operating costs is significant in route deletion decisions but not in route addition. Finally, airlines' decisions vary depending on airport hub status. These trends are observed consistently over time in the analysis of historical data from 2004 to... (A hedged logit-style sketch of such a route-choice model follows this record.)
- Published
- 2016
- Full Text
- View/download PDF
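The abstract above reports a random-utility treatment of route add/delete decisions. The following Python fragment is a minimal sketch of a binary logit of that general kind; the feature set, the toy numbers, and the use of scikit-learn's LogisticRegression are assumptions made here for illustration and are not the models, data, or estimation procedure of the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical route-level observations (features and values are illustrative):
# market demand, direct-operating-cost index, great-circle distance (km),
# and whether both endpoint airports are hubs.
X = np.array([
    [12000, 0.8,  900, 1],
    [ 1500, 1.2, 2400, 0],
    [ 8000, 0.9, 1100, 1],
    [  700, 1.4, 3000, 0],
    [15000, 0.7,  650, 1],
    [ 2200, 1.1, 1800, 0],
], dtype=float)
y = np.array([1, 0, 1, 0, 1, 0])   # 1 = route added/kept, 0 = route deleted

# A binary logit is the simplest random-utility model: the systematic utility V is
# linear in the explanatory variables and P(add) = 1 / (1 + exp(-V)).
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

candidate = np.array([[5000, 1.0, 1200, 1]], dtype=float)
print("Standardized utility coefficients:", model.named_steps["logisticregression"].coef_)
print("P(add) for the candidate route:", model.predict_proba(candidate)[0, 1])
```

The signs and relative magnitudes of the fitted coefficients play the role of the "decision-making preferences" the abstract refers to; the paper additionally constructs choice sets and validates competing specifications, which this sketch omits.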
96. Design Exploration to Determine Process Parameters of Ladle Refining for an Industrial Application
- Author
-
Farrokh Mistree, Jitesh H. Panchal, Ravikiran Anapagaddi, Amarendra K. Singh, Rishabh Shukla, and Janet K. Allen
- Subjects
Schedule, Ladle, Computational model, Engineering drawing, Decision support system, Engineering, Caster, Process (engineering), Metals and Alloys, Condensed Matter Physics, Physical plant, Unit operation, Manufacturing engineering, Mining & metallurgy, Materials engineering, Materials Chemistry, Physical and Theoretical Chemistry
Due to increasing competition and the demand for lightweight steel, manufacturers strive to produce new grades of steel that meet stringent sets of property specifications. Engineers at a major steel-making plant report that determining the set points for the ladle, tundish, and caster for a new grade of steel takes between 20 and 25 physical plant trials, which are expensive and very difficult to schedule. In this paper, the authors focus on explicating the method for one unit operation in the steel-making process, namely, ladle refining, a key operation for maintaining cleanliness and controlling composition, both of which have a significant bearing on the steel's mechanical properties. The challenge is to determine set points for ladle refining using computational models. The authors present a method for visualizing and exploring the solution space using the compromise Decision Support Problem (cDSP) and, to illustrate its efficacy, determine the set points of the ladle operation in an industrial setting. The focus of the paper is to elucidate the utility of the method, not the analysis models used in the framework. (A toy grid-exploration sketch in the spirit of this solution-space exploration follows this record.)
- Published
- 2016
- Full Text
- View/download PDF
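The abstract above describes visualizing and exploring a solution space to find feasible set points. The Python fragment below is a toy sketch of that kind of exploration under assumptions made here: the two process parameters, the exponential surrogate responses, and the target limits are invented for illustration and do not come from the paper or its cDSP formulation.

```python
import numpy as np

# Assumed process parameters for ladle refining and assumed surrogate responses
# (inclusion content and final sulphur); all forms and targets are illustrative.
stir_rate = np.linspace(0.2, 1.0, 41)      # argon stirring rate, Nm^3/min (assumed range)
time_min = np.linspace(20, 60, 41)         # refining time, minutes (assumed range)
S, T = np.meshgrid(stir_rate, time_min)

inclusion = 80.0 * np.exp(-1.5 * S * T / 40.0)       # ppm, toy surrogate
sulphur = 0.030 * np.exp(-0.04 * T) + 0.002 * S      # wt%, toy surrogate

# Feasible region: parameter combinations meeting both cleanliness and composition targets.
feasible = (inclusion < 25.0) & (sulphur < 0.012)
idx = np.argwhere(feasible)
print(f"{idx.shape[0]} of {feasible.size} grid points meet both targets")
if idx.size:
    i, j = idx[0]
    print(f"Example set point: stir rate {S[i, j]:.2f} Nm^3/min, time {T[i, j]:.0f} min")
```

A cDSP formulation would additionally trade off goal deviations rather than simply flag feasibility; the grid scan here only conveys the exploration-and-visualization idea.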
97. An Experimental Study of Human Decisions in Sequential Information Acquisition in Design: Impact of Cost and Task Complexity
- Author
-
Jitesh H. Panchal and Ashish M. Chaudhari
- Subjects
Behavioral experiment, Process (engineering), Computer science, Closeness, Bayesian inference, Machine learning, Test (assessment), Task (project management), Information acquisition, Artificial intelligence, Decision model
An important type of process-level decision in design is the information acquisition decision, which includes deciding whether to acquire information about a concept, which concepts to test, and whether to run simulations or conduct experiments. To improve design processes, it is important to understand how individuals make these decisions under different problem and process settings. Therefore, the objective of this paper is to understand which strategies individuals follow during sequential information acquisition, and how factors such as cost and task complexity affect those strategies. Towards this objective, a behavioral experiment involving a function optimization task is conducted with student subjects, and Bayesian inference is performed to estimate the closeness of the subjects' decisions to predictions from different decision models. (A hedged sketch of one such normative decision model follows this record.)
- Published
- 2019
- Full Text
- View/download PDF
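The abstract above compares subjects' choices with predictions from decision models for sequential information acquisition. The Python fragment below sketches one simple model of that kind under assumptions made here: an expected-improvement stopping rule with i.i.d. Gaussian beliefs and a fixed query cost. It is illustrative only and is not one of the specific models estimated in the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Assumed decision model: candidate designs have values drawn i.i.d. from N(mu, sigma^2);
# each query costs `cost`, and the searcher stops when the expected improvement over the
# best value found so far falls below that cost.
mu, sigma, cost = 50.0, 10.0, 1.0

def expected_improvement(best_so_far: float) -> float:
    """E[max(Y - best_so_far, 0)] for Y ~ N(mu, sigma^2)."""
    z = (mu - best_so_far) / sigma
    return (mu - best_so_far) * norm.cdf(z) + sigma * norm.pdf(z)

best = rng.normal(mu, sigma)   # first query starts the search
n_queries = 1
while expected_improvement(best) > cost:
    best = max(best, rng.normal(mu, sigma))
    n_queries += 1

print(f"Stopped after {n_queries} queries with best observed value {best:.1f}")
```

Comparing the number and sequence of queries predicted by a model like this with subjects' observed behavior is the kind of closeness assessment the abstract refers to, although the paper's models and inference procedure differ.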
98. Variation Management in a Single Forward Supply Chain
- Author
-
Janet K. Allen, Farrokh Mistree, Shabnam Rezapour, Ramakrishnan S. Srinivasan, Jeffrey D. Tew, Golnoosh Rasoulifar, Amirhossein Khosrojerdi, and Jitesh H. Panchal
- Subjects
Variation, Supply chain, Econometrics, Mathematics
- Published
- 2018
- Full Text
- View/download PDF
99. Architecting Fail-Safe Supply Networks
- Author
-
Shabnam Rezapour, Amirhossein Khosrojerdi, Golnoosh Rasoulifar, Janet K. Allen, Jitesh H. Panchal, Ramakrishnan S. Srinivasan, Jeffrey D. Tew, and Farrokh Mistree
- Published
- 2018
- Full Text
- View/download PDF
100. Emerging Technologies and Extension of the Fail-Safe Framework to Other Networks
- Author
-
Janet K. Allen, Shabnam Rezapour, Farrokh Mistree, Golnoosh Rasoulifar, Ramakrishnan S. Srinivasan, Jeffrey D. Tew, Jitesh H. Panchal, and Amirhossein Khosrojerdi
- Subjects
Extension, Risk analysis (engineering), Emerging technologies, Computer science, Fail-safe
- Published
- 2018
- Full Text
- View/download PDF