38 results for "John H. Semple"
Search Results
2. The Exponomial Choice Model: A New Alternative for Assortment and Price Optimization.
- Author
- Aydin Alptekinoglu and John H. Semple
- Published
- 2016
- Full Text
- View/download PDF
3. Optimal Category Pricing with Endogenous Store Traffic.
- Author
- Edward J. Fox, Steven Postrel, and John H. Semple
- Published
- 2009
- Full Text
- View/download PDF
4. Note: Generalized Notions of Concavity with an Application to Capacity Management.
- Author
- John H. Semple
- Published
- 2007
- Full Text
- View/download PDF
5. Optimal Inventory Policy with Two Suppliers.
- Author
- Edward J. Fox, Richard D. Metters, and John H. Semple
- Published
- 2006
- Full Text
- View/download PDF
6. The Impact of Check Sequencing on NSF (Not-Sufficient Funds) Fees.
- Author
- Aruna Apte, Uday M. Apte, Randolph P. Beatty, Ila C. Sarkar, and John H. Semple
- Published
- 2004
- Full Text
- View/download PDF
7. Economic justification for incremental implementation of advanced manufacturing systems.
- Author
- Vahid Lotfi, Joseph Sarkis, and John H. Semple
- Published
- 1998
- Full Text
- View/download PDF
8. Dominant Competitive Factors for evaluating program efficiency in grouped data.
- Author
- John J. Rousseau and John H. Semple
- Published
- 1997
- Full Text
- View/download PDF
9. An effective non-Archimedean anti-degeneracy/cycling linear programming method especially for data envelopment analysis and like models.
- Author
- Abraham Charnes, John J. Rousseau, and John H. Semple
- Published
- 1993
- Full Text
- View/download PDF
10. Choosing an n-Pack of Substitutable Products
- Author
- Laura Norman, Edward J. Fox, and John H. Semple
- Subjects
- Consumption (economics), Mathematical optimization, Computer science, Strategy and Management, Expected value, Management Science and Operations Research, Dynamic programming, Microeconomics, Bellman equation, Canonical model, Economics, Majorization, Utility model, Decision analysis
- Abstract
We develop a framework to model the shopping and consumption decisions of forward-looking consumers. Assuming that the consumer’s future utility for each product alternative can be characterized by a standard random utility model, we use dynamic programming to determine the optimal consumption policy and the maximum expected value of consuming any n substitutable products selected while shopping (an n-pack). We propose two models. In the first (canonical) model, we assume that an alternative is consumed on each successive consumption occasion and obtain a closed-form optimal policy and a closed-form value function. Given a consumer’s preferences for the product alternatives in an assortment, we then show how to identify that consumer’s optimal n-pack using a simple swapping algorithm that converges in at most n swaps. In the second (generalized) model, we introduce an outside option so that a product alternative need not be consumed on each consumption occasion. We obtain a closed-form value function for the generalized model and show that its optimal n-pack is related to that of the canonical model through a special type of majorization. Additional structural properties and implications of each model are explored, as are other applications. The online appendix is available at https://doi.org/10.1287/mnsc.2017.2729. This paper was accepted by Vishal Gaur, operations management. (An illustrative sketch of the swap step follows this record.)
- Published
- 2018
- Full Text
- View/download PDF
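A minimal sketch of the swap-improvement idea described in this record, assuming a black-box expected-value function V over n-packs and distinct products; the paper's closed-form value function and its at-most-n-swaps guarantee are not reproduced here, and the product names and utilities below are invented.

```python
from itertools import product

# Hedged sketch: generic single-swap improvement over a candidate n-pack,
# assuming a caller-supplied expected-value function V(pack). The paper's
# closed-form V converges in at most n swaps; this generic loop simply
# runs until no single swap improves V.
def swap_heuristic(assortment, n, V):
    pack = list(assortment[:n])                  # arbitrary starting n-pack
    improved = True
    while improved:
        improved = False
        for i, alt in product(range(n), assortment):
            if alt in pack:
                continue
            candidate = pack[:i] + [alt] + pack[i + 1:]
            if V(candidate) > V(pack):           # keep strictly improving swaps
                pack, improved = candidate, True
    return pack

# Toy usage with a placeholder value function (sum of item utilities):
utils = {"a": 1.0, "b": 0.4, "c": 0.9, "d": 0.2}
print(swap_heuristic(list(utils), 2, lambda p: sum(utils[j] for j in p)))
# -> ['a', 'c'] under this placeholder V
```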
11. Heteroscedastic Exponomial Choice
- Author
- Aydin Alptekinoglu and John H. Semple
- Subjects
- Price elasticity of demand, Heteroscedasticity, Discrete choice, Variance, Management Science and Operations Research, Economic surplus, Computer Science Applications, Oligopoly, Competition (economics), Econometrics, Monopoly, Mathematics
- Abstract
We investigate analytical and empirical properties of the Heteroscedastic Exponomial Choice (HEC) model to lay the groundwork for its use in theoretical and empirical research that builds demand models on a discrete choice foundation. The HEC model generalizes the Exponomial Choice (EC) model by including choice-specific variances for the random components of utility (the error terms). We show that the HEC model inherits some of the properties found in the EC model: closed-form choice probabilities, demand elasticities, and consumer surplus; optimal monopoly prices that increase with ideal utilities in a hockey-stick pattern; and unique equilibrium oligopoly prices that are easily computed using a series of single-variable equations. However, the HEC model differs from the EC model in several key ways that show variances matter: the choice probabilities (market shares) and the equilibrium oligopoly prices are not necessarily increasing with ideal utilities, and the new model can include choices with deterministic utility or choices with zero probability. Because the HEC model uses more parameters, it is harder to estimate. To justify its use, we apply HEC to grocery purchase data for thirty product categories and find that it significantly improves model fit and generally improves out-of-sample prediction compared with EC. We go on to investigate the more nuanced impact of the variance parameters on oligopoly pricing. We find that individual and collective incentives differ in equilibrium: firms individually want lower error variability for their own product, but collectively prefer higher error variability for all products, including their own, because higher error variability softens the price competition. (A Monte Carlo sketch of these choice probabilities follows this record.)
- Published
- 2018
- Full Text
- View/download PDF
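The closed-form EC/HEC choice probabilities mentioned above are not reproduced here; the Monte Carlo sketch below only assumes the utility specification U_j = v_j - e_j with exponential errors, where choice-specific scales give the heteroscedastic case. All numeric values are illustrative.

```python
import numpy as np

# Monte Carlo sketch of exponomial-style market shares. Assumed specification:
# U_j = v_j - e_j with e_j ~ Exponential(scale_j). A common scale corresponds
# to the homoscedastic EC case; choice-specific scales give HEC. The papers
# derive these probabilities in closed form; simulation only approximates them.
def choice_shares(v, scales, n_draws=200_000, seed=0):
    rng = np.random.default_rng(seed)
    eps = rng.exponential(scales, size=(n_draws, len(v)))
    winners = np.argmax(np.asarray(v, float) - eps, axis=1)
    return np.bincount(winners, minlength=len(v)) / n_draws

print(choice_shares([1.0, 0.8, 0.5], scales=[0.5, 0.5, 0.5]))  # EC-like case
print(choice_shares([1.0, 0.8, 0.5], scales=[0.2, 0.8, 0.5]))  # heteroscedastic
```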
12. Dynamic Road Pricing for Revenue Maximization
- Author
- John H. Semple, Khaled Abdelghany, and Ahmed E. Hassan
- Subjects
- Operations research, Mechanical Engineering, Total revenue, Toll road, Microeconomics, Toll, Dynamic pricing, Economics, Revenue, Road pricing, Robustness (economics), Civil and Structural Engineering
- Abstract
A modeling framework and a solution methodology are presented for the dynamic revenue maximization problem on toll roads. The problem requires determining the optimal time-varying toll prices for a multigantry toll road facility so that total revenue is maximized subject to agency-mandated constraints on average speed and average traffic volume. The framework extends a real-time traffic network state estimation and prediction system to provide dynamic pricing capabilities. The presented framework overcomes limitations of most existing approaches by considering two factors: consistency between the toll value and drivers’ willingness-to-pay behavior, and drivers’ route choice dynamics with respect to competition between the toll facility and alternative routes. The paper presents the results of a set of simulation-based experiments conducted to examine the robustness of the proposed prices under several operational scenarios.
- Published
- 2013
- Full Text
- View/download PDF
13. Data mining and revenue management methodologies in college admissions
- Author
- Amit Basu, Surya Rebbapragada, and John H. Semple
- Subjects
- Revenue management, General Computer Science, Computer science, Data mining, Performance indicator, Decision-making, Timeline
- Abstract
The competition for college admissions is getting fiercer each year, with most colleges receiving record numbers of applications and hence becoming increasingly selective. The acceptance rate at some elite colleges is as low as 10%, and the uncertainty often causes talented students to apply to schools in the next tier. Students try to improve their chances of getting into a college by applying to multiple schools, each with its own timelines and deadlines for admissions. Consequently, students are often caught in a dilemma when they run out of time to accept an offer from a university that is lower on their priority list before they know the decision from a university that they value more. The college admissions process is thus extremely stressful and unpredictable for both students and their parents. Universities, on the other hand, usually receive far more applications than they have capacity for. They consider various factors in making their decisions, with each university using its own process and timelines. A university typically relies on a weighted set of performance indicators to aid the decision-making process. These performance indicators and their associated weights are often based on a best-guess approach relying mostly on past experience. Since not all admission offers are accepted, universities send out more offer letters than their capacity and hope that the best students accept their offers. Figure 1 shows a step-by-step sequence of events in a typical university admission process. The sequence describes a scenario that results in an unfavorable outcome for both the university and the student. The student applies to two different universities and prefers one over the other (Step A), with the priority 1 university on the far right of the figure. Each university evaluates the application, and the priority 2 university makes an early offer along with a certain deadline to accept the offer (Step B). The student, uncertain about the priority 1 university, accepts the offer from the priority 2 university (Step C), possibly committing some funds. At a later date, the priority 1 university decides to accept the student (Step D), who may no longer be available. This process typically spans a number of months, is fraught with uncertainty, and results in a lose-lose situation for the priority 1 university and the student. There are two challenges in the admissions process exemplified above: (i) The process of identifying the best applicants involves multiple credentials. Given the complex interactions between these credentials, it is not easy to identify a single model that is effective for this selection process. Furthermore, given the competitive nature of university admissions, there are no normative models in the literature. (ii) Once the most desirable candidates are identified, the decision to make an offer, and the composition of that offer, are both difficult. Better candidates are likely to be sought by multiple schools, so the university has to trade off the risk of chasing (and still losing) these students against the better chances of getting the next tier of students. Furthermore, in many universities, some admission decisions and offers may have to be made before all applications are received. We believe that data mining and revenue management techniques can be used effectively to address both these challenges, and thus convert the lose-lose situation into a win-win situation.
By applying these techniques, universities can methodically score an applicant and respond almost immediately with an offer, mitigating prolonged uncertainty while increasing transparency. We demonstrate the approach using a simplified admissions process. Although individual universities may have additional, and possibly subjective, features in their admissions processes, we believe that our approach can be adapted to the specific processes of many universities and colleges. (An illustrative scoring-and-overbooking sketch follows this record.)
- Published
- 2010
- Full Text
- View/download PDF
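A deliberately small sketch of the two steps the abstract describes: scoring applicants with a weighted set of performance indicators, then overbooking offers relative to capacity using an estimated yield. The indicator columns, weights, and yield value are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Score applicants with weighted indicators, then send more offers than
# capacity based on an assumed acceptance (yield) rate -- the revenue-
# management analogue of overbooking. All data below are made up.
def rank_and_offer(indicators, weights, capacity, expected_yield):
    scores = indicators @ weights                        # weighted scores
    n_offers = int(np.ceil(capacity / expected_yield))   # overbook offers
    best_first = np.argsort(scores)[::-1]
    return best_first[:n_offers]

X = np.array([[3.8, 710.0, 1.0],     # columns: GPA, test score, essay flag
              [3.2, 690.0, 0.0],
              [3.9, 650.0, 1.0],
              [3.5, 700.0, 1.0]])
w = np.array([10.0, 0.05, 2.0])      # hypothetical indicator weights
print(rank_and_offer(X, w, capacity=2, expected_yield=0.8))  # applicant indices
```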
14. A Dynamic Inventory Model with the Right of Refusal
- Author
- Sreekumar R. Bhaskaran, Karthik Ramachandran, and John H. Semple
- Subjects
- Inventory-production policies, Dynamic programming, Stochastic, Lost sales, Inventory level, Order (business), Strategy and Management, Critical threshold, Econometrics, Economics, Production (economics), Time horizon, Management Science and Operations Research, Sales strategy
- Abstract
We consider a dynamic inventory (production) model with general convex order (production) costs and excess demand that can be accepted or refused by the firm. Excess demand that is accepted is backlogged and incurs a backlog cost, whereas demand that is refused incurs a lost-sales charge. Endogenizing the sales decision is appropriate in the presence of general convex order costs so that the firm is not forced to backlog a unit whose subsequent satisfaction would reduce total profits. In each period, the firm must determine the optimal order and sales strategy. We show that the optimal policy is characterized by an optimal buy-up-to level that increases with the initial inventory level and an order quantity that decreases with the initial inventory level. More importantly, we show that the optimal sales strategy is characterized by a critical threshold, a backlog limit, that dictates when to stop selling. This threshold is independent of the initial inventory level and the amount purchased. We investigate various properties of this new policy. As demand stochastically increases, the amount purchased increases but the amount backlogged decreases, reflecting a shift in the way excess demand is managed. We develop two regularity conditions, one that ensures some backlogs are allowed in each period, and another that ensures the amount backlogged is nondecreasing in the length of the planning horizon. We show that the buy-up-to levels in our model are bounded above by the buy-up-to levels of the pure lost-sales and pure backlogging models. We explore additional extensions using numerical experiments. (A one-period sketch of this policy structure follows this record.)
- Published
- 2010
- Full Text
- View/download PDF
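A single-period sketch of the policy structure the abstract describes: order up to a level, then sell until a backlog limit is hit and refuse the rest. Treating S and b as fixed inputs, rather than as outputs of the paper's dynamic program, is a simplifying assumption, and the numbers are invented.

```python
# Hedged one-period sketch of an order-up-to level S combined with a backlog
# limit b (the "right of refusal"). In the paper both come out of a dynamic
# program with general convex order costs; here they are plain inputs.
def one_period(x, S, b, demand):
    y = max(x, S)                      # order up to S (no disposal assumed)
    accepted = min(demand, y + b)      # sell stock, then backlog at most b units
    refused = demand - accepted        # demand beyond the backlog limit is lost
    return y - accepted, refused       # ending net inventory (< 0 means backlog)

print(one_period(x=2, S=10, b=3, demand=15))   # -> (-3, 2): backlog stops at b
```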
15. Optimal Category Pricing with Endogenous Store Traffic
- Author
- John H. Semple, Edward J. Fox, and Steven Postrel
- Subjects
- Marketing, Empirical generalization, Microeconomics, Dynamic programming, Geometric series, Economics, Current period, Business and International Management, Pricing, Optimization, Retailing, Store traffic, Profit (economics)
- Abstract
We propose a dynamic programming framework for retailers of frequently purchased consumer goods in which prices affect both the profit per visit in the current period and the number of visitors (i.e., store traffic) in future periods. We show that optimal category prices in the infinite-horizon problem also maximize the closed-form sum of a geometric series, allowing us to derive meaningful analytical results. Modeling the linkage between category prices and future store traffic fundamentally changes optimal pricing policy. Optimal pricing must balance current profits against future traffic; under general conditions, optimal long-run prices are uniformly lower across all categories than those that maximize current profits. This result explains the empirical generalization that category demand in grocery stores is inelastic. Parameterizing profit per visit and store traffic reveals that, as future traffic becomes more sensitive to price, retailers should increasingly lower current prices and sacrifice current profits. We also determine how the burden of drawing future traffic to the store should be distributed across categories; this is the foundation for a new taxonomy of category roles. (An illustrative geometric-series formulation follows this record.)
- Published
- 2009
- Full Text
- View/download PDF
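The "closed-form sum of a geometric series" mentioned above can be illustrated under one simple assumed specification (not necessarily the paper's): per-visit profit π(p), initial traffic T₀, discount factor δ, and a multiplicative traffic response g(p).

```latex
% Illustrative geometric-series value of a stationary price vector p,
% assuming traffic evolves as T_{t+1} = g(p)\,T_t (a made-up specification):
V(p) \;=\; \sum_{t=0}^{\infty} \delta^{t}\, T_0\, g(p)^{t}\, \pi(p)
      \;=\; \frac{T_0\, \pi(p)}{1 - \delta\, g(p)},
\qquad 0 \le \delta\, g(p) < 1 .
```

Under such a specification, the trade-off in the abstract is visible: lowering p sacrifices current π(p) but raises g(p), and hence the discounted stream of future traffic.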
16. A heuristic for multi-item production with seasonal demand
- Author
- Richard Metters, Michael Ketzenberg, and John H. Semple
- Subjects
- Dynamic programming, Mathematical optimization, Lost sales, Heuristic, Economics, Benchmark (computing), Production (economics), Heuristics, Industrial and Manufacturing Engineering, Multi-item
- Abstract
A heuristic is developed for a common production/inventory problem characterized by multiple products, stochastic seasonal demand, lost sales, and a constraint on overall production. Heuristics are needed because calculating optimal policies is impractical for real-world instances of this problem. The proposed heuristic is compared with heuristics in current use as well as with optimal solutions under a variety of conditions. It is both near optimal and superior to existing heuristics: in testing that used dynamic programming as a benchmark, it deviated from optimality by an average of 1.7%. This compares favorably with linear-programming-based heuristics and practitioner heuristics, which deviated from optimality by 4.5% to 10.6%.
- Published
- 2006
- Full Text
- View/download PDF
17. Quality-Based Competition, Profitability, and Variable Costs
- Author
- John H. Semple, Chester Chambers, and Panos Kouvelis
- Subjects
- Public economics, Strategy and Management, Management Science and Operations Research, Variable cost, Profit (economics), Microeconomics, Competitive behavior, Economics, Profitability index, Game theory, Operations strategy, Quality competition, Quality assurance, Duopoly
- Abstract
We consider the impact of variable production costs on competitive behavior in a duopoly where manufacturers compete on quality and price in a two-stage game. In the pricing stage, we make no assumptions regarding these costs—other than that they are positive and increasing in quality—and no assumptions about whether or not the market is covered. In the quality stage, we investigate a broad family of variable cost functions and show how the shape of these functions impacts equilibrium product positions, profits, and market coverage. We find that seemingly slight changes to the cost function’s curvature can produce dramatically different equilibrium outcomes, including the degree of quality differentiation, which competitor is more profitable (the one offering higher or lower quality), and the nature of the market itself (covered or uncovered). Our model helps to predict and explain the diversity of outcomes we see in practice—something the previous literature has been unable to do.
- Published
- 2006
- Full Text
- View/download PDF
18. A Comparison of HMO Efficiencies as a Function of Provider Autonomy
- Author
- Patrick L. Brockett, Charles C. Yang, John J. Rousseau, Ray-E Chang, and John H. Semple
- Subjects
- Economics and Econometrics, Actuarial science, Control (management), Public policy, Indemnity, Incentive, Accounting, Health care, Data envelopment analysis, Managed care, Business, Finance, Autonomy
- Abstract
Current debates in the insurance and public policy literatures over health care financing and cost control measures continue to focus on managed care and HMOs. The lower utilization rates found in HMOs (compared to traditional fee-for-service indemnity plans) have generally been attributed to the organization's incentive to eliminate all unnecessary medical services. As a consequence, HMOs are often considered to be a more efficient arrangement for delivering health care. However, it is important to make a distinction between utilization and efficiency (the ratio of outcomes to resources). Few studies have investigated the effect that HMO arrangements would have on the actual efficiency of health care delivery. Because greater control over provider autonomy appears to be a recurrent theme in the literature on reform, it is important to investigate the effects these restrictions have already had within the HMO market. In this article, the efficiencies of two major classes of HMO arrangements are compared using "game-theoretic" data envelopment analysis (DEA) models. While other studies confirm that absolute costs to insurance firms and sponsoring companies are lowered using HMOs, our empirical findings suggest that, within this framework, efficiency generally becomes worse when provider autonomy is restricted. This should give new fuel to the insurance companies providing fee-for-service (FFS) indemnification plans in their marketplace contentions.
- Published
- 2004
- Full Text
- View/download PDF
19. On the continuity of a Lagrangian multiplier function in input optimization.
- Author
- John H. Semple and Sanjo Zlobec
- Published
- 1986
- Full Text
- View/download PDF
20. Control and Belbin’s team roles
- Author
- Stephen Fisher, John H. Semple, and W.D.K. Macrosson
- Subjects
- Organizational Behavior and Human Resource Management, Team Role Inventories, Role model, Control (management), Industrial and organizational psychology, Sociology, Group dynamics, Social psychology, Applied Psychology, Fundamental interpersonal relations orientation
- Abstract
Consideration of Belbin’s team role model led to the view that some of the roles proposed might require the exercise of control, but others much less so. A hypothesis indicating which roles might be expected to manifest expressed and wanted control was developed and then tested using Schutz’s FIRO‐B questionnaire. A mixture of graduates in employment and undergraduates still at university was utilised as subjects for the investigation. After consideration of the validity of Schutz’s constructs, the data obtained were construed as supporting the hypothesis and adding weight to the claims for the validity of the Belbin team role model.
- Published
- 2001
- Full Text
- View/download PDF
21. Managing Inventory with Multiple Products, Lags in Delivery, Resource Constraints, and Lost Sales: A Mathematical Programming Approach
- Author
- John H. Semple, Brian Downs, and Richard Metters
- Subjects
- Inventory control, Mathematical optimization, Linear programming, Strategy and Management, Resource constraints, Economics, Holding cost, Nonparametric statistics, Management Science and Operations Research, Convex function, Heuristic approximation, Nonparametric estimation, Separable convex programming, Profit (economics)
- Abstract
This paper develops an order-up-to S inventory model that is designed to handle multiple items, resource constraints, lags in delivery, and lost sales without sacrificing computational simplicity. Mild conditions are shown to ensure that the expected average holding cost and the expected average shortage cost are separable convex functions of the order-up-to levels. We develop nonparametric estimates of these costs and use them in conjunction with linear programming to produce what is termed the “LP policy.” The LP policy has two major advantages over traditional methods: first, it can be computed in complex environments such as the one described above; and second, it does not require an explicit functional form of demand, something that is difficult to specify accurately in practice. In two numerical experiments designed so that optimal policies could be computed, the LP policy fared well, differing from the optimal profit by an average of 2.20% and 1.84%, respectively. These results compare quite favorably with the errors incurred in traditional methods when a correctly specified distribution uses estimated parameters. Our findings support the effectiveness of this mathematical programming technique for approximating complex, real-world inventory control problems. (A toy linear-programming sketch follows this record.)
- Published
- 2001
- Full Text
- View/download PDF
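A compact sketch of the "LP policy" construction: each item's expected holding-plus-shortage cost is a separable convex function of its order-up-to level, so a max-of-affine (piecewise-linear) epigraph LP can choose the levels under a resource constraint. The segment data and capacity below are toy stand-ins for the paper's nonparametric cost estimates.

```python
import numpy as np
from scipy.optimize import linprog

# Epigraph LP for separable convex piecewise-linear costs:
# cost_i(S) = max_k (slope_ik * S + intercept_ik), minimized subject to a
# resource constraint on the order-up-to levels. Toy data, two items.
pieces = [[(-2.0, 10.0), (1.0, -2.0)],    # item 1: min cost near S = 4
          [(-1.0, 6.0), (2.0, -6.0)]]     # item 2: min cost near S = 4
a, budget = np.array([1.0, 1.0]), 6.0     # resource usage and capacity

n = len(pieces)
c = np.r_[np.zeros(n), np.ones(n)]        # variables: S_1..S_n, then t_1..t_n
A_ub, b_ub = [], []
for i, segs in enumerate(pieces):
    for slope, intercept in segs:         # t_i >= slope * S_i + intercept
        row = np.zeros(2 * n)
        row[i], row[n + i] = slope, -1.0
        A_ub.append(row); b_ub.append(-intercept)
A_ub.append(np.r_[a, np.zeros(n)]); b_ub.append(budget)   # sum a_i S_i <= b

res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
              bounds=[(0, None)] * n + [(None, None)] * n)
print(res.x[:n], res.fun)                 # order-up-to levels (4, 2), cost 6
```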
22. Vendor Selection with Bundling: A Comment
- Author
- Joseph Sarkis and John H. Semple
- Subjects
- Integer linear programming, Mathematical optimization, Information Systems and Management, Computer science, Vendor selection, Strategy and Management, Workload, General Business, Management and Accounting, Purchasing, Management of Technology and Innovation, Bundling, Operations management, Integer programming
- Abstract
In a recent article by Rosenthal, Zydiak, and Chaudhry (1995), a mixed integer linear programming model was introduced to solve the vendor selection problem for the case in which the vendor can sell items individually or as part of a bundle. Each vendor offered only one type of bundle, and the buyer could purchase at most one bundle per vendor. The model employed n(m + 1) binary variables, where n is the number of vendors and m is the number of products they sell. The existing model can lead to a purchasing paradox: it may force the buyer to pay more to receive less. We suggest a reformulation of the same problem that (i) eliminates this paradox and reveals a more cost-effective purchasing strategy; (ii) uses only n integer variables and significantly reduces the computational workload; and (iii) permits the buyer to purchase more than one bundle per vendor. (A toy reformulation sketch follows this record.)
- Published
- 1999
- Full Text
- View/download PDF
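A hedged sketch of the reformulation idea in this comment: one integer variable per vendor counts how many of that vendor's bundle the buyer takes (more than one is allowed), alongside per-item purchase quantities. The prices, bundle contents, and demands below are invented, and the model is only a rough stand-in for the paper's formulation.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint

# Toy vendor-selection MILP: minimize purchase cost while meeting item demand,
# buying items individually and/or as whole-vendor bundles. Assumption: each
# bundle contains one unit of every item that vendor sells.
n_vendors, n_items = 2, 2
item_price = np.array([[4.0, 5.0],      # vendor 1: per-item prices
                       [5.0, 4.0]])     # vendor 2
bundle_price = np.array([7.0, 8.0])     # price of each vendor's bundle
demand = np.array([3.0, 2.0])

# Variables: x[v, j] individual purchases (flattened), then z[v] bundle counts.
c = np.r_[item_price.ravel(), bundle_price]
rows = []
for j in range(n_items):                # meet demand for each item j
    row = np.zeros(n_vendors * n_items + n_vendors)
    for v in range(n_vendors):
        row[v * n_items + j] = 1.0               # individually purchased units
        row[n_vendors * n_items + v] = 1.0       # each bundle adds one unit
    rows.append(row)
cons = LinearConstraint(np.array(rows), lb=demand, ub=np.inf)

res = milp(c, constraints=cons, integrality=np.ones_like(c))
print(res.x, res.fun)   # e.g. two vendor-1 bundles plus one extra item, cost 18
```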
23. Optimality conditions and solution procedures for nondegenerate dual-response systems
- Author
- John H. Semple
- Subjects
- Mathematical optimization, Degeneracy (mathematics), Stationary point, Industrial and Manufacturing Engineering, Quadratic programming, Newton's method, Finite set, Quality control, Mathematics
- Abstract
This paper investigates the dual-response problem in the case where the response functions are nonconvex (nonconcave) quadratics and the independent variables satisfy a radial bound. Sufficient conditions for a global optimum are established and shown to generalize to the multi-response case. It is then demonstrated that the sufficient conditions may not hold if the problem is ‘degenerate’. However, if the problem is nondegenerate, it is shown that the sufficient conditions are necessarily satisfied by some stationary point. In this case, a specialized algorithm (DRSALG) is shown to locate the global optimum in a finite number of steps. DRSALG will also identify the degenerate case and pinpoint the location where degeneracy occurs. The algorithm is easy to implement from the standpoint of code development, and we illustrate our elementary version on a well-studied dual-response example from quality control. (A generic multistart sketch of this problem class follows this record.)
- Published
- 1997
- Full Text
- View/download PDF
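DRSALG itself is not reproduced here; the following is only a generic multistart local-search sketch of the same problem class, namely a possibly nonconvex quadratic secondary response optimized subject to a target on the primary response and a radial bound. All coefficients are made up, and unlike DRSALG this sketch carries no global-optimality guarantee.

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

# Multistart local search (NOT the paper's DRSALG) for a dual-response
# problem: minimize a nonconvex quadratic subject to a primary-response
# target and a spherical (radial) bound on the factors.
def quadratic(b0, b, B):
    return lambda x: b0 + b @ x + x @ B @ x

secondary = quadratic(10.0, np.array([1.0, -2.0]),
                      np.array([[1.0, 0.5], [0.5, -2.0]]))   # nonconvex
primary = quadratic(0.0, np.array([2.0, 1.0]),
                    np.array([[0.5, 0.0], [0.0, 0.5]]))
constraints = [NonlinearConstraint(lambda x: x @ x, 0.0, 4.0),  # radius <= 2
               NonlinearConstraint(primary, 1.0, 1.0)]          # target value

rng = np.random.default_rng(0)
runs = [minimize(secondary, rng.uniform(-2.0, 2.0, size=2),
                 method="trust-constr", constraints=constraints)
        for _ in range(20)]
best = min(runs, key=lambda r: r.fun)
print(best.x, best.fun)   # best local solution found across restarts
```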
24. Constrained games for evaluating organizational performance
- Author
- John H. Semple
- Subjects
- Mathematical optimization, Information Systems and Management, General Computer Science, Linear programming, Normal-form game, Management Science and Operations Research, Organizational performance, Industrial and Manufacturing Engineering, Core (game theory), Nash equilibrium, Modeling and Simulation, Data envelopment analysis, Economics, Mathematical economics, Game theory
- Abstract
This paper develops a class of two-person game theoretic models that provide insight into exceptional aspects of organizational performance. In contrast to earlier games used for performance evaluations, the new model permits constraints on the strategies available to each of the players, a restriction that is frequently imposed in practice. Despite the more complicated nonlinearities involved in these games compared to classical (finite) two-person forms, the players' solutions (Nash equilibria) can still be described as the optimal solutions to a primal-dual linear programming (LP) pair. If the game's data elements are replaced with the observed inputs and outputs obtained from competing organizations, then the resulting LPs resemble those currently used in the core ratio models of Data Envelopment Analysis (DEA). Apart from further intertwining game theory and DEA, the constrained games feature additional improvements such as the ability to handle zeros in the data, a well-defined sensitivity analysis, and the unification of previous DEA principles within the context of constrained strategies.
- Published
- 1997
- Full Text
- View/download PDF
25. Dominant Competitive Factors for evaluating program efficiency in grouped data
- Author
- John H. Semple and John J. Rousseau
- Subjects
- Theoretical computer science, Linear programming, Program efficiency, Computer science, Data envelopment analysis, Econometrics, General Decision Sciences, Management Science and Operations Research, Game theory, Grouped data
- Abstract
Dominant Competitive Factors, unique solutions to a new class of two-person ratio efficiency games, are introduced as a means for distinguishing exceptional aspects of individual performance. The vectors of input-output multipliers thus obtained may be analyzed collectively so that commonalities within groups and differences across groups may be discovered. The method is applied to "Program Follow-Through", the original impetus for developing Data Envelopment Analysis. Our results are compared with those of the earlier study, whereupon substantial new insights are obtained.
- Published
- 1997
- Full Text
- View/download PDF
26. The Exponomial Choice Model
- Author
- Aydin Alptekinoglu and John H. Semple
- Subjects
- Microeconomics, Oligopoly, Discrete choice, Revenue management, Willingness to pay, Mixed logit, Econometrics, Economics, Multinomial distribution, Price optimization, Multinomial logistic regression
- Abstract
We investigate the use of a canonical version of a discrete choice model due to Daganzo (1979) in optimal pricing and assortment planning. In contrast to multinomial and nested logit (the prevailing choice models used for optimizing prices and assortments), this model assumes a negatively skewed distribution of consumer utilities, an assumption we motivate by conceptual arguments as well as published work. The choice probabilities in this model can be derived in closed form as an exponomial (a linear function of exponential terms). The pricing and assortment planning insights we obtain from the Exponomial Choice (EC) model differ from the literature in two important ways. First, the EC model allows variable markups in optimal prices that increase with expected utilities. Second, when prices are exogenous, the optimal assortment may exhibit leapfrogging in prices, i.e., a product can be skipped in favor of a lower-priced one depending on the utility positions of neighboring products. These two plausible pricing and assortment patterns are ruled out by multinomial logit (and by nested logit within each nest). We provide structural results on optimal pricing for monopoly and oligopoly cases, and on optimal assortments for both exogenous and endogenous prices. We also demonstrate how the EC model can be easily estimated by establishing that the log-likelihood function is concave in the model parameters and detailing an estimation example using real data. (An assumed form of the underlying utility specification is written out after this record.)
- Published
- 2013
- Full Text
- View/download PDF
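One common way to write a utility specification consistent with the description above (an assumption on our part; the record itself does not spell it out): each product's utility is an ideal utility minus an i.i.d. exponential error, which produces the negatively skewed utilities and the exponomial (linear-in-exponentials) choice probabilities mentioned in the abstract.

```latex
% Assumed utility specification consistent with the abstract (illustrative):
U_j \;=\; \mu_j - \varepsilon_j, \qquad
\varepsilon_j \overset{\text{i.i.d.}}{\sim} \operatorname{Exp}(\lambda),
\qquad j = 1, \dots, n .
```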
27. Sensitivity and stability of efficiency classifications in Data Envelopment Analysis
- Author
- John J. Rousseau, John H. Semple, and Abraham Charnes
- Subjects
- Economics and Econometrics, Mathematical optimization, Linear programming, Data envelopment analysis, Applied mathematics, Sensitivity, Radius, Business and International Management, Stability (probability), Measure (mathematics), Social Sciences (miscellaneous), Mathematics
- Abstract
A new technique for assessing the sensitivity and stability of efficiency classifications in Data Envelopment Analysis (DEA) is presented. Developed here for the ratio (CCR) model, this technique extends easily to other DEA variants. An organization's input-output vector serves as the center for a cell within which the organization's classification remains unchanged under perturbations of the data. For the l1, l∞, and generalized l∞ norms, the radius of the maximal cell can be computed using linear programming formulations. This radius can be interpreted as a measure of the classification's stability, especially with respect to errors in the data. (The underlying CCR linear program is sketched after this record.)
- Published
- 1996
- Full Text
- View/download PDF
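The radius-of-stability formulations themselves are not reproduced here, but they are built on the CCR ratio model, whose efficiency score is the value of the standard Charnes-Cooper linear program sketched below (toy data; assumes NumPy and SciPy are available).

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR score of unit o. X: (m, n) inputs, Y: (s, n) outputs.
    Charnes-Cooper form: max u'y_o s.t. v'x_o = 1, u'y_j - v'x_j <= 0, u,v >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[np.zeros(m), -Y[:, o]]              # minimize -u'y_o
    A_ub = np.c_[-X.T, Y.T]                       # u'y_j - v'x_j <= 0 for all j
    b_ub = np.zeros(n)
    A_eq = np.r_[X[:, o], np.zeros(s)][None, :]   # normalization v'x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (m + s))
    return -res.fun

X = np.array([[2.0, 3.0, 4.0]])                   # one input, three units
Y = np.array([[1.0, 2.0, 2.5]])                   # one output
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])
# -> [0.75, 1.0, 0.938]: unit 2 is CCR-efficient, the others are not
```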
28. Radii of Classification Preservation in Data Envelopment Analysis: a Case Study of ‘Program Follow-Through’
- Author
- John H. Semple and John J. Rousseau
- Subjects
- Marketing, Linear programming, Strategy and Management, Efficient frontier, Management Science and Operations Research, Group efficiency, Residual, Management Information Systems, Robustness, Norm (mathematics), Statistics, Econometrics, Data envelopment analysis, Additive model, Mathematics
- Abstract
Sensitivity and robustness of efficiency classifications for the additive model and its geometric equivalents in Data Envelopment Analysis (DEA) are addressed. The minimum distance (measured by a Tchebycheff norm) separating an organization from reclassification is computed via linear programming formulations and shown to constitute a generalized ‘residual’ for each organization. Without this sensitivity information, findings can be distorted when marginally efficient or inefficient units are distinguished solely on the basis of their classification. Analysis of these residuals from an earlier (inconclusive) DEA study further reveals how substantive differences in a sample's underlying groups can be detected. Properties of group efficiency and group proximity to the efficient frontier are investigated using these new indicators.
- Published
- 1995
- Full Text
- View/download PDF
29. Two-Person Ratio Efficiency Games
- Author
- John J. Rousseau and John H. Semple
- Subjects
- Linear programming, Strategy and Management, Stochastic game, Management Science and Operations Research, Data envelopment analysis, Game theory, Efficiency, Mathematical economics, Mathematics
- Abstract
This paper demonstrates that a class of two-person games with ratio payoff functions can be solved using equivalent primal-dual linear programming formulations. The game’s solution contains specialized information which may be used to conduct the efficiency evaluation currently done by the CCR ratio model of Data Envelopment Analysis (DEA). Consequently a rigorous connection between DEA’s CCR model and the theory of games is established. Interpretations of these new solutions are discussed in the context of current ongoing applications.
- Published
- 1995
- Full Text
- View/download PDF
30. An effective non-Archimedean anti-degeneracy/cycling linear programming method especially for data envelopment analysis and like models
- Author
- John J. Rousseau, Abraham Charnes, and John H. Semple
- Subjects
- Mathematical optimization, Linear programming, Computer science, Data envelopment analysis, General Decision Sciences, Management Science and Operations Research, Degeneracy (mathematics)
- Abstract
An effective non-Archimedean anti-degeneracy/cycling method is developed for linear programming models, especially Data Envelopment Analysis (DEA), processing networks, and advertising media mix models. In an application involving 1000 DEA LPs, it gave a tenfold speed increase over conventional Marsden or Kennington/Ali LP software modules and eliminated cycling difficulties.
- Published
- 1993
- Full Text
- View/download PDF
31. Non-archimedean infinitesimals, transcendentals and categorical inputs in linear programming and data envelopment analysis
- Author
- Abraham Charnes, John H. Semple, and John J. Rousseau
- Subjects
- Mathematical optimization, Simplex algorithm, Linear programming, Control and Systems Engineering, Infinitesimal, Data envelopment analysis, Base field, Extreme point, Categorical variable, Computer Science Applications, Theoretical Computer Science, Mathematics, Transcendentals
- Abstract
Two problems in linear programming associated with data envelopment analysis (DEA) are addressed: employing non-Archimedean infinitesimals and transcendentals (‘big Ms’), and employing categorical variables (in a new non-Archimedean formulation). A new, more sophisticated pricing procedure, used as part of an adjacent extreme point algorithm, solves these efficiently in the base field. Employing this in Charnes's non-Archimedean simplex algorithm led to a ninefold increase in computational speed on large DEA problems with about 1000 decision-making units. Additionally, computational failures due to cycling and/or conditioning instabilities were eliminated.
- Published
- 1992
- Full Text
- View/download PDF
32. Semi-infinite relaxation of joint chance constraints in chance-constrained programming Part 1. Zero-order stochastic decision rules†
- Author
- Y. C. Chang, Abraham Charnes, and John H. Semple
- Subjects
- Zero order, Mathematical optimization, Linear inequality, Linear programming, Semi-infinite, Control and Systems Engineering, Decision rule, Relaxation (approximation), Computer Science Applications, Theoretical Computer Science, Mathematics
- Abstract
A new class of semi-infinite deterministic dominants and relaxations (‘determinizations’) of joint chance constraints in chance-constrained programming is developed and specialized to zero-order stochastic decision rule situations. Tight constraint relaxations are obtained where only the partial information of means and variances is known. The tight nonlinear semi-infinite relaxations are related to an accessible finite subsystem. When the chance constraints involve linear inequalities, for a large class, the nonlinear tight system is proved equivalent to a linear program. Its solutions for the Prekopa-Szantai reservoir construction examples agree well with theirs. (An illustrative mean-variance determinization follows this record.)
- Published
- 1992
- Full Text
- View/download PDF
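One classical mean-variance determinization of this type, given purely for illustration (it is not necessarily the paper's exact construction): with only the mean μ and covariance Σ of the random coefficient vector a known, the distribution-free chance constraint reduces to a single deterministic inequality via the one-sided Chebyshev (Cantelli) bound.

```latex
% Illustrative distribution-free determinization (Cantelli bound), assuming
% only the mean \mu and covariance \Sigma of the random row vector a are known:
\inf_{a \sim (\mu,\,\Sigma)} P\!\left(a^{\top}x \le b\right) \ge \alpha
\quad\Longleftrightarrow\quad
\mu^{\top}x + \sqrt{\tfrac{\alpha}{1-\alpha}}\,\sqrt{x^{\top}\Sigma\, x} \;\le\; b .
```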
33. Sensitivity of efficiency classifications in the additive model of data envelopment analysis
- Author
- John H. Semple, Patrick V. Jaska, Stephen Haag, and Abraham Charnes
- Subjects
- Mathematical optimization, Linear programming, Decision theory, Stability, Computer Science Applications, Theoretical Computer Science, Matrix (mathematics), Control and Systems Engineering, Econometrics, Data envelopment analysis, Sensitivity, Additive model, Mathematics
- Abstract
In contrast to existing sufficient conditions for preservation of efficiency under special perturbations and matrix structural assumptions, the sensitivity of the additive model's classifications in data envelopment analysis (DEA) is investigated by means of new DEA formulations that focus on the stability of an organization's classification (whether efficient or inefficient). The formulations for the additive model are linear programming problems whose solutions yield a particular region of stability, a ‘cell’, in which an organization's classification remains unchanged. The largest such cell can always be computed easily for each organization and can, in addition, be characterized theoretically as the set of optimal solutions of particular linear programming problems.
- Published
- 1992
- Full Text
- View/download PDF
34. Assessing the relative efficiency of agricultural production units in the Blackland Prairie, Texas
- Author
- Patrick V. Jaska, Stephen Haag, and John H. Semple
- Subjects
- Economics and Econometrics, Efficiency, Agriculture, Data envelopment analysis, Economics, Agricultural productivity, Additive model, Agricultural economics
- Abstract
The way in which data envelopment analysis (DEA) can be used to assess the relative technical efficiency of agricultural production units is illustrated. Two variants of the existing DEA additive model are introduced, each eliminating earlier difficulties encountered in application and theory. These new models are demonstrated through application to agricultural data obtained from the Blackland Prairie, Texas.
- Published
- 1992
- Full Text
- View/download PDF
35. A General Approximation to the Distribution of Count Data with Applications to Inventory Modeling
- Author
- Bezalel Gavish, John H. Semple, and Edward J. Fox
- Subjects
- Inverse Gaussian distribution, Empirical data, Mathematical optimization, Distribution (mathematics), Stockout, Log-normal distribution, Random variable, Mathematics, Count data
- Abstract
We derive a general approximation to the distribution of count data based on the first two moments of the underlying interarrival distribution. The result is a variant of the Birnbaum-Saunders (BISA) distribution. This distribution behaves like the lognormal in several respects; however, we show that the BISA can fit both simulated and empirical data better than the lognormal and that the BISA possesses additive properties that the lognormal does not. This results in computational advantages for operational models that involve summing random variables. Moreover, although the BISA can be fit to count data (as we demonstrate empirically), it can also be fit directly to transaction-level interarrival data. This provides a simple, practical way to sidestep distributional fitting problems that arise from count data censored by inventory stockouts. In numerical experiments involving dynamic inventory models, we compare the BISA distribution to other commonly used distributions and show how it leads to better managerial decisions. (An illustrative moment-based sketch follows this record.)
- Published
- 2008
- Full Text
- View/download PDF
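A hedged sketch of the moment-based construction behind BISA-type count approximations: for a renewal process with interarrival mean mu and standard deviation sigma, the renewal identity P(N(t) >= n) = P(T_n <= t) combined with a normal approximation to the n-fold sum T_n gives a two-moment count distribution. The gamma interarrivals below are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

# Two-moment approximation to the count tail: P(N(t) >= n) = P(T_n <= t),
# with T_n (sum of n interarrivals) approximated as Normal(n*mu, n*sigma^2).
def count_tail(n, t, mu, sigma):
    return norm.cdf((t - n * mu) / (sigma * np.sqrt(n)))

mu, sigma, t, n = 1.0, 0.5, 20.0, 25
rng = np.random.default_rng(1)
# Gamma interarrivals chosen to match mean mu and std sigma (illustrative):
gaps = rng.gamma(shape=(mu / sigma) ** 2, scale=sigma**2 / mu,
                 size=(100_000, n))
sim = (gaps.sum(axis=1) <= t).mean()       # simulated P(T_25 <= 20) = P(N >= 25)
print(sim, count_tail(n, t, mu, sigma))    # both approximately 0.02
```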
36. Notes: Categorical Outputs in Data Envelopment Analysis
- Author
- John J. Rousseau and John H. Semple
- Subjects
- Algebra, Data envelopment analysis, Efficiency, Categorical outputs, Linear programming, Strategy and Management, Computation, Linear programming formulation, Management Science and Operations Research, Categorical variable, Algorithm, Mathematics
- Abstract
A new linear programming formulation for handling categorical outputs in DEA is presented which eliminates the difficulties of interpretation and computation that accompanied earlier mixed integer models.
- Published
- 1993
- Full Text
- View/download PDF
37. On a necessary condition for stability in perturbed linear and convex programming
- Author
- John H. Semple and Sanjo Zlobec
- Subjects
- Convex analysis, General Mathematics, Mathematical analysis, Convex optimization, Linear matrix inequality, Convex set, Proper convex function, Second-order cone programming, Convex combination, Subderivative, Management Science and Operations Research, Software, Mathematics
- Abstract
We show that a necessary condition for stable perturbations in linear and convex programming is valid on an arbitrary region of stability. Using point-to-set mappings, two new regions of stability are identified.
- Published
- 1987
- Full Text
- View/download PDF
38. The computation of global optima in dual response systems
- Author
- Shu Kai Fan, John H. Semple, and Enrique Castillo
- Subjects
- Mathematical optimization, Fortran, Computer science, Strategy and Management, Computation, Management Science and Operations Research, Dual response, Industrial and Manufacturing Engineering, Region of interest, Response surface methodology, Safety, Risk, Reliability and Quality, Global optimization
- Abstract
This paper presents an ANSI FORTRAN implementation of a new algorithm for the global optimization of dual response systems within a spherical region of interest. The algorithm, DRSALG, is a new computational method that guarantees, under conditions that..