92 results for "Gerald G. Brown"
Search Results
52. Analyzing the Vulnerability of Critical Infrastructure to Attack and Planning Defenses
- Author
-
Gerald G. Brown, W. Matthew Carlyle, Javier Salmerón, and Kevin Wood
- Published
- 2005
53. Optimization-Based Military Capital Planning
- Author
-
Alexandra M. Newman, Robert F. Dell, and Gerald G. Brown
- Subjects
Finance, Engineering, Military science, Domestic technology, Military threat, Management, Capital budgeting, Procurement, Military theory, Capital (economics), Military tactics - Abstract
The United States military carefully plans and justifies its materiel procurements. These decisions have a profound, long-term impact on our ability to defend our nation, and to fight and win our nation's wars. Annual U.S. materiel investment is now larger than that of the rest of the world combined, and attracts keen attention from political leaders and government contractors. Procurement plans are complicated by their influence on domestic technology and production abilities, conflicting objectives, concerns regarding interoperability and maintainability of the materiel, and the sheer scale of the endeavor. Mathematical optimization models have long played a key role in unraveling the complexities of capital planning, and the military has led the development and use of such models. We survey the history of optimizing civilian and military capital plans and then present prototypic models exhibiting features that render these models useful for real-world decision support.
- Published
- 2004
54. The Fast Theater Model (FATHM)
- Author
-
Alan R. Washburn and Gerald G. Brown
- Published
- 2002
55. Valid Integer Polytope (VIP) Penalties for Branch-and-Bound Enumeration
- Author
-
Gerald G. Brown, Robert F. Dell, and Michael P. Olson
- Subjects
Discrete mathematics, Branch and bound, Linear programming, Applied Mathematics, Polytope, Management Science and Operations Research, Upper and lower bounds, Industrial and Manufacturing Engineering, Combinatorics, Integer-linear programming, Penalties, Enumeration, Integer programming, Software, Mathematics - Abstract
Operations Research Letters, 26, pp. 117-126. We introduce new penalties, called valid integer polytope (VIP) penalties, that tighten the bound of an integer-linear program during branch-and-bound enumeration. Early commercial codes for branch and bound commonly employed penalties developed from the dual simplex lower bound on the cost of restricting fractional integer variables to proximate integral values. VIP penalties extend and tighten these ubiquitous k-pack, k-partition, and k-cover constraints. In real-world problems, VIP penalties occasionally tighten the bound by more than an order of magnitude, but they usually offer small bound improvement. Their ease of implementation, speed of execution, and occasional overwhelming success make them an attractive addition during branch-and-bound enumeration.
- Published
- 2000
56. Optimally Reorganizing Navy Shore Infrastructure
- Author
-
Robert F. Dell, Gerald G. Brown, and Mitchell C. Kerman
- Subjects
Shore, Finance, Engineering, Operations research, Cost effectiveness, Outsourcing, Military personnel, Navy, Cold War, Technical report, Resource management - Abstract
The end of the Cold War has allowed the United States to significantly reduce defense spending. Spending has been reduced for both the force structure (i.e., equipment and manpower) and the military support base (i.e., infrastructure), but infrastructure reductions continue to lag force structure reductions. The United States Navy's recent initiatives to reduce its shore infrastructure costs include "regionalization," "outsourcing," and "homebasing." While regionalization and outsourcing decrease the number of jobs needed on a shore installation, homebasing generally increases the number of available personnel. These opposing effects require careful implementation. This thesis develops the Regionalization and Outsourcing Optimization Model (ROOM), an integer linear program that identifies an optimal combination of regionalization and outsourcing options for a Navy shore installation with personnel altered by homebasing. A ROOM test case uses actual data from the Pearl Harbor Naval Installation with proposed homebasing, regionalization, and outsourcing options for 109 "functions," or shore installation activities. Disregarding homebasing and its opposing effects, regionalization is the lowest-cost option for 106 of these functions. ROOM's optimal solution, however, recommends regionalizing only 21 functions, outsourcing 14, and leaving 74 unchanged. This solution yields a first-year savings of $9.5 million.
- Published
- 1998
57. Consolidation of Customer Orders Into Truckloads at a Large Manufacturer
- Author
-
Gerald G. Brown and David Ronen
- Subjects
Marketing, Strategic planning, Truck, Operations research, Logistics, Optimisation, Strategy and Management, Information technology, Management Science and Operations Research, Practice of OR, Automation, Purchasing, Management Information Systems, Consolidation (business), Information system, Project management - Abstract
Journal of the Operational Research Society, 48, pp. 779-785. We describe the development and operation of an interactive system based on a mathematical optimization model which is used by a major US manufacturer to consolidate customer orders into truckloads. Dozens of users employ the system daily for planning delivery of orders from manufacturing plants to customers by truckload carriers, saving numerous hours of the users' time and reducing transportation costs.
- Published
- 1997
58. Optimizing Submarine Berthing with a Persistence Incentive
- Author
-
Kelly J. Cormican, Gerald G. Brown, Siriphong Lawphongpanich, and Daniel B. Widdis
- Subjects
Cost reduction, Navy, Corrective maintenance, Berth allocation problem, Modeling and Simulation, Crew, Submarine, Ocean Engineering, Operations management, Business, Management Science and Operations Research, Training, Port - Abstract
Submarine berthing plans reserve mooring locations for inbound U.S. Navy nuclear submarines prior to their port entrance. Once in port, submarines may be shifted to different berthing locations to allow them to better receive services they require or to make way for other shifted vessels. However, submarine berth shifting is expensive, labor intensive, and potentially hazardous. This article presents an optimization model for submarine berth planning and demonstrates it with Naval Submarine Base, San Diego. After a berthing plan has been approved and published, changed requests for services, delays, and early arrival of inbound submarines are routine events, requiring frequent revisions. To encourage trust in the planning process, the effect on the solution of revisions in the input is kept small by incorporating a persistence incentive in the optimization model. © 1997 John Wiley & Sons, Inc. Naval Research Logistics 44: 301-318, 1997. Although the Cold War has ended, United States Navy submarines remain very capable and effective ships of war: a smaller number of submarines operated from fewer submarine bases will continue to play a significant role in national defense. The wise use of time and resources while submarines are in port will improve the state of readiness of a smaller fleet. While in port, a submarine completes preventive and corrective maintenance, replenishes stores, and conducts training and certification tests to maintain high material and personnel readiness. Ideally, a submarine in port should devote its time exclusively to these activities. However, submarines frequently spend time shifting berths. Some shifts are necessary and some are not. Services such as ordnance loading and the use of special maintenance equipment require that a submarine be moored at a specific location. During periodic maintenance upkeep, personnel from a submarine tender assist the submarine crew, and berthing near the tender is preferable. During training, inspection, and other periods, it is desirable to berth closer to shore, near squadron offices and training facilities. When conditions permit ...
- Published
- 1997
59. Optimizing Ship Berthing
- Author
-
Siriphong Lawphongpanich, Katie Podolak Thurman, and Gerald G. Brown
- Subjects
Shore, Operations research, Computer science, Ocean Engineering, Certification, Management Science and Operations Research, Training, Port, Scheduling (computing), Navy, Berth allocation problem, Modeling and Simulation - Abstract
Naval Research Logistics, 41, pp. 1-15. Ship berthing plans reserve a location for inbound U.S. Navy surface vessels prior to their port entrance, or reassign ships once in port to allow them to complete, in a timely manner, re-provisioning, repair, maintenance, training, and certification tests prior to redeploying for future operational commitments. Each ship requires different services when in port, such as shore power, crane, ordnance, and fuel. Unfortunately, not all services are offered at all piers, and berth shifting is disruptive and expensive. A port operations schedule strives to reduce unnecessary berth shifts. We present an optimization model for berth planning and demonstrate it for Norfolk Naval Station, which exhibits all the richness of berthing problems the Navy faces.
- Published
- 1994
60. A tribute
- Author
-
Gerald G. Brown
- Subjects
Modeling and Simulation, Ocean Engineering, Management Science and Operations Research - Published
- 2011
61. An Optimization Model for Modernizing the Army's Helicopter Fleet
- Author
-
Gerald G. Brown, R. K. Wood, Robert D. Clemence, and William R. Teufert
- Subjects
Engineering, Decision support system, Operations research, Programming - Integer applications, Strategy and Management, Time horizon, Management Science and Operations Research, Finance - capital budgeting, Modernization, Procurement, Management of Technology and Innovation, Military - cost effectiveness, Phoenix - Abstract
Interfaces, 21, pp. 39-52. The helicopter has grown in military stature for more than 40 years: its ascendancy has reformed the US Army. Unfortunately, the current Army helicopter fleet consists predominantly of Vietnam-era aircraft approaching the end of their useful lives. We have captured complex procurement and modernization tasks in an optimization-based decision support system, christened PHOENIX, which recognizes yearly operating, maintenance, retirement, service-life extension, and new procurement costs while enforcing constraints on fleet age, technology mix, composition, and budgets over a multi-year planning horizon. The Army has applied PHOENIX to helicopters with such success that it has already been adapted to tactical wheeled vehicles and is under consideration for further applications.
- Published
- 1991
62. Call for Nominations
- Author
-
Gerald G. Brown
- Subjects
Strategy and Management, Political science, Editor in chief, Management Science and Operations Research, Management - Published
- 2008
63. Scientists Urge DHS to Improve Bioterrorism Risk Assessment.
- Author
-
Gregory S. Parnell, Luciana L. Borio, Gerald G. Brown, David Banks, and Alyson G. Wilson
- Abstract
In 2006, the Department of Homeland Security (DHS) completed its first Bioterrorism Risk Assessment (BTRA), intended to be the foundation for DHS's subsequent biennial risk assessments mandated by Homeland Security Presidential Directive 10 (HSPD-10). At the request of DHS, the National Research Council established the Committee on Methodological Improvements to the Department of Homeland Security's Biological Agent Risk Analysis to provide an independent, scientific peer review of the BTRA. The Committee found a number of shortcomings in the BTRA, including a failure to consider terrorists as intelligent adversaries in their models, unnecessary complexity in threat and consequence modeling and simulations, and a lack of focus on risk management. The Committee unanimously concluded that an improved BTRA is needed to provide a more credible foundation for risk-informed decision making.
- Published
- 2008
64. Solving Generalized Networks
- Author
-
Richard D. McBride and Gerald G. Brown
- Subjects
Flow algorithms, Generalized networks/graphs, Mathematical optimization, Simplex, Linear programming, Mathematical model, Computer science, Strategy and Management, Graph theory, Transportation theory, Management Science and Operations Research, Flow network, Generalized assignment problem - Abstract
Management Science, 30, 12, pp. 1497-1523. (1984 Lanchester Prize Finalist.) A complete, unified description is given of the design, implementation, and use of a family of very fast and efficient large-scale minimum-cost (primal simplex) network programs. The class of capacitated generalized transshipment problems solved includes the capacitated and uncapacitated generalized transportation problems and the continuous generalized assignment problem, as well as the pure network flow models which are special cases of these problems. These formulations are used for a large number of diverse applications to determine how (or at what rate) flows through the arcs of a network can minimize total shipment costs. A generalized network problem can also be viewed as a linear program with at most two nonzero entries in each column of the constraint matrix; this property is exploited in the mathematical presentation, with special emphasis on data structures for basis representation, basis manipulation, and pricing mechanisms. A literature review accompanies computational testing of promising ideas, and extensive experimentation is reported which has produced GENNET, an extremely efficient family of generalized network systems.
- Published
- 1984
65. An examination of the effects of the criterion functional on optimal fire-support policies
- Author
-
James G. Taylor and Gerald G. Brown
- Subjects
Mathematical optimization, Local optimum, Optimization problem, Ranking, Operations research, Computer science, Infantry, General Engineering, Fire support, Adversary, Optimal control - Abstract
This paper examines the dependence of the structure of optimal time-sequential fire-support policies on the quantification of military objectives by considering four specific problems, each corresponding to a different quantification of objectives (i.e., criterion functional). The authors consider the optimal time-sequential allocation of supporting fires during the 'approach to contact' of friendly infantry against enemy defensive positions. The combat dynamics are modelled by deterministic Lanchester-type equations of warfare, and the optimal fire-support policy for each one-sided combat optimization problem is developed via optimal control theory. The problems are all nonconvex, and local optima are a particular difficulty in one of them. For the same dynamics, the splitting of supporting fires between two enemy forces in an optimal policy (i.e., the optimality of singular subarcs) is shown to depend only on whether the terminal payoff reflects the objective of attaining an 'overall' military advantage or a 'local' one. Additionally, switching times for changes in the ranking of target priorities are shown to be different when the decision criterion is the difference versus the ratio of the military worths of total infantry survivors, and also the difference versus the ratio of the military worths of the combatants' total infantry losses.
- Published
- 1978
66. Tables for Determining Expected Cost Per Unit under MIL-STD-105D Single Sampling Schemes
- Author
-
Gerald G. Brown and Herbert C. Rutemiller
- Subjects
Engineering, Acceptance sampling, Sample size determination, Statistics, Sampling (statistics), Lot quality assurance sampling, Expected value, Quality assurance, Industrial and Manufacturing Engineering - Abstract
AIIE Transactions, 6, pp. 135-142. When a MIL-STD-105D sampling scheme is used for a long period, some lots will be subjected to normal, some to reduced, and some to tightened inspection. This paper provides, for several single sampling plans and various quality levels, the expected fraction of lots rejected, the expected sample size per lot, and the expected number of lots to be processed before sampling inspection must be discontinued. Equations are given to calculate the long-term cost of sampling inspection using these expected values and appropriate cost parameters.
- Published
- 1974
67. Annihilation Prediction for Lanchester-Type Models of Modern Warfare
- Author
-
Gerald G. Brown and James G. Taylor
- Subjects
Mathematical optimization, Annihilation, Battle, Offset, Management Science and Operations Research, Modern warfare, Computer Science Applications, Homogeneous, Analytic solution, Analytic function, Mathematics - Abstract
This paper introduces important new functions for analytic solution of Lanchester-type equations of modern warfare for combat between two homogeneous forces modeled by power attrition-rate coefficients with "no offset." Tabulations of these Lanchester-Clifford-Schläfli (or LCS) functions allow one to study this particular variable-coefficient model almost as easily and thoroughly as Lanchester's classic constant-coefficient one. LCS functions allow one to obtain important information (in particular, force-annihilation prediction) without having to spend the time and effort of computing force-level trajectories. The choice of these particular functions is based on theoretical considerations that apply in general to Lanchester-type equations of modern warfare and provide guidance for developing other canonical functions. Moreover, our new LCS functions also provide valuable information about related variable-coefficient models. Also, we introduce an important transformation of the battle's time scale that not only simplifies the force-level equations, but also shows that relative fire effectiveness and intensity of combat are the only two weapon-system parameters determining the course of such variable-coefficient Lanchester-type combat.
- Published
- 1983
68. Exceptional Paper--Design and Implementation of Large Scale Primal Transshipment Algorithms
- Author
-
Glenn W. Graves, Gordon H. Bradley, and Gerald G. Brown
- Subjects
Mathematical optimization, Linear programming, Computer science, Strategy and Management, Minimum-cost flow problem, Management Science and Operations Research, Representation (mathematics), Flow network, Assignment problem, Algorithm, Multi-commodity flow problem, Transshipment - Abstract
A complete description is given of the design, implementation and use of a family of very fast and efficient large scale minimum cost (primal simplex) network programs. The class of capacitated transshipment problems solved is the most general of the minimum cost network flow models which include the capacitated and uncapacitated transportation problems and the classical assignment problem; these formulations are used for a large number of diverse applications to determine how (or at what rate) a good should flow through the arcs of a network to minimize total shipment costs. The presentation tailors the unified mathematical framework of linear programming to networks with special emphasis on data structures which are not only useful for basis representation, basis manipulation, and pricing mechanisms, but which also seem to be fundamental in general mathematical programming. A review of pertinent optimization literature accompanies computational testing of the most promising ideas. Tuning experiments for the network system, GNET, are reported along with important extensions such as exploitation of special problem structure, element generation techniques, postoptimality analysis, operation with problem generators and external problem files, and a simple noncycling pivot selection procedure which guarantees finiteness for the algorithm.
- Published
- 1977
69. Means and Variances of Stochastic Vector Products with Applications to Random Linear Models
- Author
-
Gerald G. Brown and Herbert C. Rutemiller
- Subjects
Mathematical optimization, Strategy and Management, Linear predictor function, Deterministic simulation, Random function, Randomness tests, Management Science and Operations Research, Random variable, Stochastic programming, Randomness, Mathematics - Abstract
Management Science, 24, 2, pp. 210-216. Applications in operations research often employ models which contain linear functions. These linear functions may have some components (coefficients and variables) which are random. (For instance, linear functions in mathematical programming often represent models of processes which exhibit randomness in resource availability, consumption rates, and activity levels.) Even when the linearity assumptions of these models are unquestioned, the effects of the randomness in the functions are of concern. Methods to accommodate, or at least estimate, the implications of randomness in the components of a linear function typically make several simplifying assumptions. Unfortunately, when components are known to be random in a general, multivariate dependent fashion, concise specification of the randomness exhibited by the linear function is, at best, extremely complicated, usually requiring severe, unrealistic restrictions on the density functions of the random components. Frequent stipulations include assertion of normality, or of independence; yet observed data, accepted collateral theory, and common sense may dictate that a symmetric distribution with infinite domain limits is inappropriate, or that a dependent structure is definitely present. (For example, random resource levels may be highly correlated due to economic conditions, and non-negative for physical reasons.) Often, an investigation is performed by discretizing the random components at point quantile levels, or by replacing the random components by their means, methods which give a deterministic "equivalent" model with constant terms, but possibly very misleading results. Outright simulation can be used, but requires considerable time investment for setup and debugging (especially for generation of dependent sequences of pseudorandom variates) and gives results with high parametric specificity and computation cost. This paper shows how to use elementary methods to estimate the mean and variance of a linear function with arbitrary multivariate randomness in its components. Expressions are given for the mean and variance and are used to make Tchebycheff-type probability statements which can accommodate and exploit stochastic dependence. Simple estimation examples are given which lead to illustrative applications with (dependent) stochastic programming models.
- Published
- 1977
70. Automatic Identification of Generalized Upper Bounds in Large-Scale Optimization Models
- Author
-
Gerald G. Brown and David S. Thomen
- Subjects
Mathematical optimization, Polynomial, Optimization problem, Linear programming, Heuristic, Strategy and Management, Management Science and Operations Research, Upper and lower bounds, Simplex algorithm, Programming: large-scale systems, generalized upper bounds, Algorithm, Integer programming, Mathematics - Abstract
To solve contemporary large-scale linear, integer and mixed integer programming problems, it is often necessary to exploit intrinsic special structure in the model at hand. One commonly used technique is to identify and then to exploit in a basis factorization algorithm a generalized upper bound (GUB) structure. This report compares several existing methods for identifying GUB structure. Computer programs have been written to permit comparison of computational efficiency. The GUB programs have been incorporated in an existing optimization system of advanced design and have been tested on a variety of large-scale real-life optimization problems. The identification of GUB sets of maximum size is shown to be among the class of NP-complete problems; these problems are widely conjectured to be intractable in that no polynomial-time algorithm has been demonstrated for solving them. All the methods discussed in this report are polynomial-time heuristic algorithms that attempt to find, but do not guarantee, GUB sets of maximum size. Bounds for the maximum size of GUB sets are developed in order to evaluate the effectiveness of the heuristic algorithms.
- Published
- 1980
71. Estimation of the probability of labor force participation of the AFDC population-at-risk
- Author
-
Gerald G. Brown, Herbert C. Rutemiller, and Eric J. Solberg
- Subjects
Estimation, Iterative method, Population, Interval estimation, Asymptotic distribution, Sigmoid function, Standard deviation, Statistics, Econometrics, Economics, General Economics, Econometrics and Finance - Abstract
In summary, the functional form makes quite a difference. An investigator should be quite wary of making generalizations based on any single specification or estimation technique. However, the above results have shown in striking fashion the superiority of maximum likelihood estimation (MLE) of the sigmoid specifications over OLS estimation of the linear probability specification. Although the sigmoid specifications require iterative solution, this is no barrier on a modern digital computer with appropriate special algorithms. A further advantage of the MLE is the asymptotic normality of the estimates of ϑ_i, which permits large-sample interval estimation, and the method-of-scoring iteration employed directly yields an estimate of the standard deviation of each normally distributed ϑ_i. Standard tests of significance are also applicable.
- Published
- 1977
72. Design and Operation of a Multicommodity Production/Distribution System Using Primal Goal Decomposition
- Author
-
Maria D. Honczarenko, Gerald G. Brown, and Glenn W. Graves
- Subjects
Cost reduction, Mathematical optimization, Decision support system, Computer science, Strategy and Management, Production control, Scheduling (production processes), Programming: large-scale systems, integer applications, facilities/equipment planning, Decomposition method, Management Science and Operations Research, Integer programming - Abstract
Management Science, 33, p. 1469. (Nominated for 1987 International Management Science Achievement Award.) This paper was originally presented at ORSA/TIMS, Houston, October 14, 1981, under the title "Large-Scale Facility and Equipment Location: An Application of Goal Programming in Multicommodity Decomposition." The article of record as published may be found at https://doi.org/10.1287/mnsc.33.11.1469 An optimization-based decision support system has been developed and used by NABISCO to manage complex problems involving facility selection, equipment location and utilization, and manufacture and distribution of products such as the familiar Ritz Crackers, Oreo Cookies, Fig Newtons, etc. (all product names trademarks of NABISCO). A mixed-integer, multicommodity model is presented for the problems at hand, and a new class of goal decompositions is introduced to yield pure network subproblems for each commodity; the associated master problems have several notable properties which contribute to the effectiveness of the algorithm. Excellent-quality solutions for problems with more than 40,000 variables (including several hundred binary variables with fixed charges) and in excess of 20,000 constraints require only a 0.6-megabyte region and less than one compute minute on a time-shared IBM 3033 computer; average problems (with fewer binary variables) require only a second or two. The solution method has more to recommend it than sheer efficiency: new insights are given into the fundamental convergence properties of formal decomposition techniques. Several applications of this powerful interactive tool are discussed.
- Published
- 1987
73. Production and Sales Planning with Limited Shared Tooling at the Key Operation
- Author
-
Gerald G. Brown, Arthur M. Geoffrion, and Gordon H. Bradley
- Subjects
Mathematical optimization, Linear programming, Computer science, Strategy and Management, Molding (process), Flow shop, Integer programming applications (inventory/production), Management Science and Operations Research, Flow network, Industrial engineering, Production (economics), Relaxation (approximation), Integer programming, Sales and operations planning - Abstract
Management Science, 27, 3, pp. 247-259. (Nominated for 1981 International Management Science Achievement Award.) The focus of this paper is multiperiod production and sales planning when there is a single dominant production operation for which tooling (dies, molds, etc.) can be shared among parts and is limited in availability. Our interest in such problems grew out of management issues confronting an injection molding manufacturer of plastic pipes and fittings for the building and chemical industries, but similar problems abound in the manufacture of many other cast, extruded, molded, pressed, or stamped products. We describe the development and successful application of a planning model and an associated computational approach for this class of problems. The problem is modeled as a mixed integer linear program. Lagrangean relaxation is applied so as to exploit the availability of highly efficient techniques for minimum cost network flow problems and for single-item dynamic lot-sizing type problems. For the practical application at hand, provably good solutions are routinely being obtained in modest computing time to problems far beyond the capabilities of available mathematical programming systems.
- Published
- 1981
74. A cost analysis of sampling inspection under Military Standard 105D
- Author
-
Herbert C. Rutemiller and Gerald G. Brown
- Subjects
Binomial distribution, Acceptance sampling, Markov chain, Sample size determination, Computer science, Sampling (statistics), Statistics, General Engineering, Range (statistics), Quality - Abstract
Military Standard 105D has been almost universally adopted by government and private consumers for the lot-by-lot sampling inspection of product which may be inspected on a dichotomous basis. The plan specifies, for each lot size, a random sample size and a set of acceptance numbers (maximum allowable number of defectives in each sample). The acceptance numbers are based upon the binomial distribution and depend upon the quality required by the purchaser. Where several consecutive lots are submitted, a shift to less severe ("reduced") inspection or more severe ("tightened") inspection is specified when the ongoing quality is very high or low. Further experience permits a return to normal sampling from either of these states. This paper examines the long-range costs of such a sampling scheme. The three inspection types are considered as three distinct Markov chains, with periodic transitions from chain to chain. The expected sample size and the expected proportion of rejected product are determined as a function of the two parameters under control of the manufacturer: lot size and product quality. Some numerical examples are given which illustrate how to compute the overall cost of sampling inspection. Suggestions are made concerning the choice of parameters to minimize this cost.
- Published
- 1973
75. A Sequential Stopping Rule for Fixed-Sample Acceptance Tests
- Author
-
Gerald G. Brown and Herbert C. Rutemiller
- Subjects
Sequential estimation, Acceptance testing, Stopping time, Statistics, Conditional probability, Sample (statistics), Optimal stopping, Management Science and Operations Research, Decision problem, Computer Science Applications, Mathematics, Weibull distribution - Abstract
The occurrence of early failures in a fixed-sample acceptance test, where the sample observations are obtained sequentially, presents an interesting decision problem. It may be desirable to abandon the test at an early stage if the conditional probability of passing is small and the testing cost is high. This paper presents a stopping rule based on the maximum-likelihood estimate of total costs involved in the decision to continue beyond an early failure. A Bernoulli model, an exponential model, and a Weibull model are examined.
- Published
- 1971
76. Some Probability Problems Concerning the Game of Bingo
- Author
-
Gerald G. Brown and Herbert C. Rutemiller
- Subjects
Mathematics education, Mathematical game, Game theory, Recreational mathematics, Mathematics - Abstract
We have found the well-known game of bingo to be an excellent device for illustrating to students various laws of probability and a source of hypergeometric probability problems to supplement the familiar playing-card examples. In particular, solution of the question "What is the expected length of a game of bingo as a function of the number of players?" provides students an opportunity to develop the expectation of a random variable theoretically and then to verify the result by computer simulation of the game.
- Published
- 1973
77. Evaluation of Pr{X ⩾ Y} When Both X and Y Are from Three-Parameter Weibull Distributions
- Author
-
Gerald G. Brown and Herbert C. Rutemiller
- Subjects
Weibull modulus, Cumulative distribution function, Population, Mathematical analysis, Convolution of probability distributions, Joint probability distribution, Statistics, Probability distribution, Electrical and Electronic Engineering, Safety, Risk, Reliability and Quality, Exponentiated Weibull distribution, Mathematics, Weibull distribution - Abstract
It is important in many reliability applications to determine the probability that the failure time of an element from one population will exceed that of an element from a second population. In this paper, we present a method for computer calculations of Pr {x ⩾ y} where X and Y are each from a three-parameter Weibull distribution. In addition, we provide the moments and the probability density function of the difference. Numerical examples are included.
- Published
- 1973
78. The Efficiencies of Maximum Likelihood and Minimum Variance Unbiased Estimators of Fraction Defective in the Normal Case
- Author
-
Gerald G. Brown and Herbert C. Rutemiller
- Subjects
Statistics and Probability, Efficient estimator, Minimum-variance unbiased estimator, Mean squared error, Bias of an estimator, Estimation theory, Applied Mathematics, Modeling and Simulation, Statistics, Consistent estimator, Estimator, Mathematics - Abstract
This paper compares two point estimators of the fraction defective of a normal distribution when both population parameters are unknown: the minimum variance unbiased estimator (m.v.u.e.) and the maximum likelihood estimator (m.l.e.). Using minimum mean squared error as a criterion, it is shown that the choice of estimator depends upon the true value of F(x) and the sample size. In the domain .0005 ≤ F(x) ≤ .50, the maximum likelihood estimator is generally superior even for small sample sizes, except for F(x) less than about 0.01 or greater than 0.25. Furthermore, the bias in the m.l.e. is slight over much of the domain where this estimator has smaller mean squared error. As a practical solution to the estimation problem, it is suggested that the m.v.u.e. be calculated, and if this estimate is between 0.01 and 0.25, it should be replaced with the m.l.e. This combined estimator is shown to be nearly as efficient as the better of the m.v.u.e. and m.l.e. throughout the domain of F(x).
- Published
- 1973
79. Scheduling Ocean Transportation of Crude Oil
- Author
-
Gerald G. Brown, David Ronen, and Glenn W. Graves
- Subjects
Ballast, Schedule, Mathematical optimization, Transportation planning, Opportunity cost, Job shop scheduling, Computer science, Strategy and Management, Transportation: planning, set partitioning, enumerative methods, Scheduling (production processes), Fuel oil, Management Science and Operations Research, Demurrage, Scheduling (computing), Petroleum industry, Fuel efficiency, Petroleum, Elasticity (economics), Integer programming - Abstract
Management Science, 33, pp. 335-346. (Nominated for 1987 International Management Science Achievement Award.) The article of record as published may be found at http://links.jstor.org/sici?sici=0025-1909%28198703%2933%3A3%3C335%3ASOTOCO%3E2.0.CO%3B2-F A crude oil tanker scheduling problem faced by a major oil company is presented and solved using an elastic set partitioning model. The model takes into account all fleet cost components, including the opportunity cost of ship time, port and canal charges, demurrage, and bunker fuel. The model determines optimal speeds for the ships and the best routing of ballast (empty) legs, as well as which cargoes to load on controlled ships and which to spot charter. All feasible schedules are generated, the cost of each is accurately determined, and the best set of schedules is selected. For the problems encountered, optimal integer solutions to set partitioning problems with thousands of binary variables have been achieved in less than a minute.
- Published
- 1987
80. On Random Binary Trees
- Author
-
Bruno O. Shubert and Gerald G. Brown
- Published
- 1976
81. ATHENA: Users Manual for Interactive Analysis of Large-Scale Optimization Models
- Author
-
Panagiotis I. Galatas, Gerald G. Brown, and Gordon H. Bradley
- Subjects
User friendly, Parsing, Database, Fortran, Computer science, Interface (computing), Computer file, Directory, Query language, Software, Software engineering - Abstract
Analyses of solutions for large-scale optimization models are very difficult without effective computer aids. Solution reports may require weeks to design, implement, and produce with conventional report writing systems. The reports produced are voluminous, often exceeding 100,000 printed lines, and are thus quite awkward to access manually. Timely and economic analysis of solutions to large models is further hindered by inflexible and costly report writing software and procedures. ATHENA has been developed to allow extremely efficient, immediate, interactive storage and analysis of the solution file from any optimization system. ATHENA is easy to learn and use; user-friendly features are provided which can preemptively assess the potential cost and implications of each request for solution information, assist the confused user, and provide the required solution information with very fast response time. The user is provided with extensive search under mask and compound logical relational constructs, as well as the capability to quickly diagnose suspicious model symptoms, and to format and issue offline reports. ATHENA is implemented in portable FORTRAN, with a parser and interpreter easily modified and expanded to suit particular hardware environments and user demands. The system has been initially designed and tuned for large-scale problems with up to 30,000 rows and columns. Live test demonstrations show that the system exhibits very fast response time in actual use. This report presents a user's manual for the prototype ATHENA query language, an error message directory, and a description of interface and extension provisions.
- Published
- 1980
82. AUTOMATIC IDENTIFICATION OF EMBEDDED STRUCTURE IN LARGE-SCALE OPTIMIZATION MODELS
- Author
-
William G. Wright and Gerald G. Brown
- Subjects
Mathematical optimization, Identification (information), Linear programming, Scale (ratio), Branch and price, Criss-cross algorithm, Algorithm, Integer programming, Linear complementarity problem, Mathematics, Linear-fractional programming - Abstract
Appears in Large-Scale Linear Programming, eds. Dantzig, G., et al., IIASA, Laxenburg, Austria, pp. 89-93.
- Published
- 1981
83. Canonical Methods in the Solution of Variable-Coefficient Lanchester-Type Equations of Modern Warfare
- Author
-
James G. Taylor and Gerald G. Brown
- Subjects
Mathematical theory, Variable coefficient, Mathematical optimization, Homogeneous, Applied mathematics, Management Science and Operations Research, Power, Computer Science Applications, Mathematics - Abstract
This paper develops a mathematical theory for solving deterministic, Lanchester-type, 'square-law' attrition equations for combat between two homogeneous forces with temporal variations in fire effectivenesses (as expressed by the Lanchester attrition-rate coefficients). It gives a general form for expressing the solution of such variable-coefficient combat attrition equations in terms of Lanchester functions, which are introduced here and can be readily tabulated. Different Lanchester functions arise from different mathematical forms for the attrition-rate coefficients. We give results for two such forms: (1) effectiveness of each side's fire proportional to a power of time, and (2) effectiveness of each side's fire linear with time but with a nonconstant ratio of attrition-rate coefficients. Previous results in the literature for a nonconstant ratio of these attrition-rate coefficients only took a convenient form under rather restrictive conditions. This research was supported by the Office of Naval Research as part of the Foundation Research Program at the Naval Postgraduate School.
- Published
- 1976
84. Means and Variances of Stochastic Vector Products with Applications to Random Linear Models
- Author
-
Herbert C. Rutemiller and Gerald G. Brown
- Published
- 1977
85. A Table of Lanchester-Clifford-Schlaefli Functions
- Author
-
James G. Taylor and Gerald G. Brown
- Subjects
Engineering, Firepower, Operations research, Technical report, Table - Abstract
Supported by the U.S. Army Research Office, Durham, North Carolina, under R&D Project No. IL161102BH57-05 Math.
- Published
- 1977
86. Numerical Determination of the Parity-Condition Parameter for Lanchester-Type Equations of Modern Warfare
- Author
-
Gerald G. Brown and James G. Taylor
- Subjects
Mathematical optimization, Offset, General Computer Science, Numerical analysis, Single parameter, Parity, Management Science and Operations Research, Modern warfare, General theory, Homogeneous, Modeling and Simulation, Applied mathematics, Parametric statistics, Mathematics - Abstract
Computers and Operations Research, 5, 4, pp. 227-242. This paper presents a simple numerical procedure for determining the parity-condition parameter for deterministic Lanchester-type combat between two homogeneous forces. Deterministic differential-equation combat models are commonly used in parametric studies for computational reasons, since they give essentially the same results for the mean course of combat as do corresponding stochastic attrition models. The combat studied in this paper is modelled by Lanchester-type equations of modern warfare with time-dependent attrition-rate coefficients. Previous research has generalized Lanchester's classic "square law" to such variable-coefficient combat. It has shown that the prediction of battle outcome (in particular, force annihilation) without having to spend the time and effort of computing force-level trajectories depends on a single parameter, the so-called parity-condition parameter, which is "the enemy force equivalent of a friendly force of unit strength" and depends only on the attrition-rate coefficients. Unfortunately, previous research did not show generally how to determine this parameter. We present general theoretical considerations for its numerical, noniterative determination. This general theory is applied to an important class of attrition-rate coefficients (offset power-rate coefficients). Our results allow one to study such variable-coefficient combat models almost as easily and thoroughly as Lanchester's classic constant-coefficient model.
- Published
- 1978
87. A Short Table of Lanchester-Clifford-Schlafli Functions
- Author
-
James G. Taylor and Gerald G. Brown
- Subjects
Offset, Mathematical model, Parametric analysis, Computer science, Homogeneous, Numerical analysis, Applied mathematics - Abstract
This report contains a reduced set of tables of Lanchester-Clifford-Schläfli (LCS) functions. A companion report contains a more extensive (and currently the most extensive available) set of tables of the LCS functions. These functions may be used to analyze Lanchester-type combat between two homogeneous forces modelled by power attrition-rate coefficients with no offset. Theoretical background for the LCS functions is given, as well as a narrative description of the physical circumstances under which the associated Lanchester-type combat model may be expected to be applicable. Numerical examples are given to illustrate the use of the LCS functions for analyzing aimed-fire combat modelled by the power attrition-rate coefficients with no offset. Our results and these tabulations allow one to study this particular variable-coefficient combat model almost as easily and thoroughly as Lanchester's classic constant-coefficient model.
- Published
- 1977
88. Mobilizing Marine Corps Officers
- Author
-
Richard E. Rosenthal, Stephen H. Rapp, Danny R. Hundley, Gerald G. Brown, and Dan O. Bausch
- Published
- 1989
89. On Random Binary Trees
- Author
-
Gerald G. Brown and Bruno O. Shubert
- Subjects
Discrete mathematics, Binary tree, General Mathematics, Weight-balanced tree, Management Science and Operations Research, Random binary tree, Computer Science Applications, Treap, Combinatorics, Binary search tree, Geometry of binary search trees, Binary expression tree, Self-balancing binary search tree, Mathematics - Abstract
A widely used class of binary trees is studied in order to provide information useful in evaluating algorithms based on this storage structure. A closed form counting formula for the number of binary trees with n nodes and height k is developed and restated as a recursion more useful computationally. A generating function for the number of nodes given height is developed and used to find the asymptotic distribution of binary trees. An asymptotic probability distribution for height given the number of nodes is derived based on equally likely binary trees. This is compared with a similar result for general trees. Random binary trees (those resulting from a binary tree sorting algorithm applied to random strings of symbols) are counted in terms of the mapping of permutations of n symbols to binary trees of height k. An explicit formula for this number is given with an equivalent recursive definition for computational use. A generating function is derived for the number of symbols given height. Lower and upper bounds on random binary tree height are developed and shown to approach one another asymptotically as a function of n, providing a limiting expression for the expected height. The random binary trees are examined further to provide expressions for the expectations of the number of vacancies at each level, the distribution of vacancies over all levels, the comparisons required for insertion of a new random symbol, the fraction of nodes occupied at a particular level, the number of leaves, the number of single vacancies at each level, and the number of twin vacancies at each level. A random process is defined for the number of symbols required to grow a tree exceeding any given height. Finally, an appendix is given with sample tabulations and figures of the distributions.
- Published
- 1984
90. Automatic identification of embedded network rows in large-scale optimization models
- Author
-
Gerald G. Brown and William G. Wright
- Subjects
Optimization, Special ordered set, Mathematical optimization, Computational complexity theory, Heuristic, General Mathematics, Generalized upper bounds, Set (abstract data type), Mixed integer, Basis factorization, Networks, Coefficient matrix, Large-scale optimization, Integer programming, Software, Mathematics - Abstract
The solution of a large-scale linear, integer, or mixed integer programming problem is often facilitated by the exploitation of special structure in the model. This paper presents heuristic algorithms for identifying embedded network rows within the coefficient matrix of such models. The problem of identifying a maximum-size embedded pure network is shown to be among the class of NP-hard problems. The polynomially-bounded, efficient algorithms presented here do not guarantee network sets of maximum size. However, upper bounds on the size of the maximum network set are developed and used to show that our algorithms identify embedded networks of close to maximum size. Computational tests with large-scale, real-world models are presented.
- Published
- 1984
91. Letter to the Editor—In Remembrance of Richard Clasen
- Author
-
Gerald G. Brown
- Subjects
Letter to the editor, Management Science and Operations Research, Computer Science Applications - Published
- 1984
92. Response to Ezell and von Winterfeldt.
- Author
-
Gregory S. Parnell, Luciana L. Borio, Louis A. (Tony) Cox, Gerald G. Brown, Stephen Pollock, and Alyson G. Wilson
- Published
- 2009