29 results
Search Results
2. AN EVALUATION OF THE EMPIRICAL SIGNIFICANCE OF OPTIMAL SEEKING ALGORITHMS IN PORTFOLIO SELECTION.
- Author
PORTER, R. BURR and BEY, ROGER P.
- Subjects
PORTFOLIO management (Investments), INVESTMENT analysis, PORTFOLIO performance, SECURITIES, INVESTMENTS, ALGORITHMS
- Abstract
The mean-variance (EV) portfolio selection rule has recently been challenged by a new procedure referred to as the Stochastic Dominance Rule (SD), which is touted as being theoretically and empirically superior. However, the SD procedure suffers from at least one technical deficiency not associated with the EV model--the lack of a search algorithm that builds efficient combinations of assets. The purpose of this paper is to evaluate the significance of this problem and to consider procedures for its alleviation. [ABSTRACT FROM AUTHOR]
- Published
- 1974
3. Hash Coding with a Non-Unique Search Key.
- Author
Bookstein, Abraham
- Subjects
HASHING, COMPUTER storage devices, ELECTRONIC file management, INFORMATION storage & retrieval systems, ALGORITHMS, INFORMATION retrieval, INFORMATION science
- Abstract
This paper defines a hash coding model for non-unique search keys and derives the expected number of accesses needed to retrieve all desired records from a computer storage device. The assumption that the records are stored on the basis of a non-unique key is often realized in information retrieval environments. The model assumes that the hashing algorithm and, should a collision occur, the skipping algorithm, both distribute the records randomly in memory. The results of this analysis are compared with those from a simulation in which the randomness criterion is not strictly met. [ABSTRACT FROM AUTHOR]
- Published
- 1974
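The expected-accesses question in the abstract above can be illustrated with a small Monte-Carlo sketch. Everything here is an illustrative assumption, not Bookstein's model: the function name `avg_accesses`, the parameter values, and the use of linear probing as a stand-in for the paper's random-skipping assumption.

```python
import random

def avg_accesses(n_records, table_size, n_keys, trials=300, seed=1):
    """Estimate the mean number of probes (storage accesses) needed to
    retrieve every record stored under one non-unique key, with uniform
    hashing and linear-probe collision skipping."""
    rng = random.Random(seed)
    probes_total = 0
    for _ in range(trials):
        table = [None] * table_size
        home = [rng.randrange(table_size) for _ in range(n_keys)]
        for _ in range(n_records):
            key = rng.randrange(n_keys)        # non-unique: keys repeat
            slot = home[key]
            while table[slot] is not None:     # collision: skip to next slot
                slot = (slot + 1) % table_size
            table[slot] = key
        # Retrieval: probe from the key's home slot until an empty slot is
        # reached; with linear probing every record for that key lies in
        # this run, and each probe is one device access.
        slot, probes = home[0], 0
        while table[slot] is not None:
            probes += 1
            slot = (slot + 1) % table_size
        probes += 1                            # final probe hits the empty slot
        probes_total += probes
    return probes_total / trials
```

As the model predicts, denser tables force longer probe chains: `avg_accesses(80, 100, 10)` is several times larger than `avg_accesses(40, 100, 10)`.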
4. Journal Disposition Decision Policies.
- Author
Rush, Barbara, Steinberg, Sam, and Kraft, Donald H.
- Subjects
PERIODICALS, ACADEMIC libraries, DOCUMENTATION, ALGORITHMS, CAPITAL investments, INVESTMENTS
- Abstract
The problem of whether to bind, microcopy or discard back issues of journals in a university library branch system is considered and an algorithm is developed to solve the journal disposition problem. A cost-effectiveness approach is pursued, employing an analytical model. This model uses a set of weighted factors to quantify the value of a specific journal to the library. Such factors as relevance, usage, availability elsewhere and capital investment are specified. Budget constraints, which involve the relevant costs of binding and microcopying, are considered, as are upper and lower threshold values on the worth of a journal to guarantee the retention of exceptionally good journals and the disposal of very poor ones. An example based on data from a real university special library situation is presented as an illustration of the model. Thus, this paper extends the work of others in modeling of the library collection development decision process. [ABSTRACT FROM AUTHOR]
- Published
- 1974
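The weighted-factor value model with upper and lower thresholds described above can be sketched as follows. The factor names follow the abstract, but the weights, scores, threshold values, and function names are purely illustrative assumptions, not the paper's calibration.

```python
# Hypothetical weights on the abstract's factors (relevance, usage,
# availability elsewhere, capital investment); scores are on a 0-1 scale.
WEIGHTS = {"relevance": 0.4, "usage": 0.3, "availability": 0.2, "investment": 0.1}

def journal_worth(scores):
    """Weighted-factor value of a journal to the library."""
    return sum(WEIGHTS[f] * scores[f] for f in WEIGHTS)

def disposition(scores, keep_above=0.75, discard_below=0.25):
    """Threshold rule: retain exceptionally good journals, discard very
    poor ones; mid-range titles go to the budget-constrained decision
    (bind vs. microcopy under the relevant costs)."""
    w = journal_worth(scores)
    if w >= keep_above:
        return "bind"
    if w <= discard_below:
        return "discard"
    return "evaluate under budget"
```

For example, a journal scoring 0.9 on relevance, 0.8 on usage, 0.2 on availability, and 0.6 on investment has worth 0.70 and falls into the budget-constrained middle band.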
5. Search Strategy, Construction and Use of Citation Networks, with a Socio-Scientific Example: "Amorphous Semi-Conductors and S.R. Ovshinsky"
- Author
Cawkell, Anthony E.
- Subjects
GRAPHIC methods, ALGORITHMS, SEMICONDUCTORS, CHARTS, diagrams, etc., GEOMETRICAL drawing, ALGEBRA
- Abstract
The deductions which may be made from inspection of a citation map of the literature are briefly described, and a search algorithm is given for the construction of such maps. Following the construction and assessment of a network describing the development of amorphous semiconductors, it is concluded that citation diagrams are a valuable aid for socio-scientific studies. [ABSTRACT FROM AUTHOR]
- Published
- 1974
6. PORTFOLIO THEORY WHEN INVESTMENT RELATIVES ARE LOGNORMALLY DISTRIBUTED.
- Author
ELTON, EDWIN J. and GRUBER, MARTIN J.
- Subjects
INVESTMENTS, ALGORITHMS, UTILITY functions, DISTRIBUTION (Probability theory), RATE of return, RISK aversion, MATHEMATICAL reformulation, MARGINAL utility, INVESTMENT analysis, PORTFOLIO management (Investments)
- Abstract
The article reformulates portfolio theory under the assumption that investment relatives are lognormally distributed. Standard portfolio theory is shown to be correct under the assumptions that returns are normally distributed and that the investor's utility function is quadratic; several reasons for challenging these assumptions, such as the unrealistic implications of the quadratic utility function, are discussed. The authors present an algorithm for determining the portfolios in the efficient set and discuss the implications for the marginal utility of investors.
- Published
- 1974
7. An Algorithm for Generating Structural Surrogates of English Text.
- Author
Strong, Suzanne M.
- Subjects
COMPUTER algorithms, ALGORITHMS, SYNTAX (Grammar), GRAMMAR, COMPUTER programming, VOCABULARY
- Abstract
This paper describes the development and application of an algorithm which generates non-linear representations of English text. The algorithm uses the results of a syntactic analysis system and a set of rules which prescribe linkages to generate a graph of a sentence. The shape of these graphs corresponds to the syntax of the sentence; the labels correspond to the vocabulary of the sentence and the edge types correspond to case grammar roles. The sentence graphs can then be interconnected at common nodes and analyzed according to common edges. Preliminary experimentation has yielded promising results. It appears that the algorithm produces a representation of English text which could be quite useful in automatic language processing. [ABSTRACT FROM AUTHOR]
- Published
- 1974
8. A PRICE SCHEDULES DECOMPOSITION ALGORITHM FOR LINEAR PROGRAMMING PROBLEMS.
- Author
Jennergren, Peter
- Subjects
MATHEMATICAL decomposition, ALGORITHMS, PRICING, PRODUCTION scheduling, LINEAR programming, MATRICES (Mathematics), LINEAR substitutions, MATHEMATICAL transformations, RESOURCE allocation
- Abstract
It is known that prices alone cannot usually be used to coordinate a linear economic system. This paper considers a linear economic system, formally represented as a linear programming model interpreted as a resource-allocation problem. An algorithm is developed that associates with each resource a linearly increasing price schedule rather than a constant price. The paper thus demonstrates that a mechanism rather similar to a pure price mechanism can be used both to find and to sustain an optimal allocation of resources in a linear economic system. [ABSTRACT FROM AUTHOR]
- Published
- 1973
9. MAXIMIZATION BY QUADRATIC HILL-CLIMBING.
- Author
Goldfeld, Stephen M., Quandt, Richard E., and Trotter, Hale F.
- Subjects
APPROXIMATION theory, QUADRATIC equations, MATHEMATICAL functions, ALGORITHMS, FUNCTIONAL analysis, MATHEMATICS, ECONOMETRICS, ECONOMICS, ECONOMIC models, MATHEMATICAL economics, MATHEMATICAL models, ECONOMETRIC models
- Abstract
The purpose of this paper is to describe a new gradient method for maximizing general functions. After a brief discussion of various known gradient methods the mathematical foundation is laid for the new algorithm which rests on maximizing a quadratic approximation to the function on a suitably chosen spherical region. The method requires no assumptions about the concavity of the function to be maximized and automatically modifies the step size in the light of the success of the quadratic approximation to the function. The paper further discusses some practical problems of implementing the algorithm and presents recent computational experience with it. [ABSTRACT FROM AUTHOR]
- Published
- 1966
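The core idea described above — maximizing a quadratic approximation to the function on a suitably chosen spherical region by shifting the Hessian — can be sketched as below. The `qhc_step` function and its `radius_scale` control are hypothetical simplifications for illustration; the paper's actual step-size modification rule is more elaborate.

```python
import numpy as np

def qhc_step(grad, hess, radius_scale=1.0):
    """One quadratic hill-climbing style step: shift the Hessian by
    -lambda*I so that (H - lambda*I) is negative definite, then take the
    Newton-like step for the shifted quadratic model. No concavity of
    the underlying function is assumed."""
    eigmax = np.max(np.linalg.eigvalsh(hess))
    # lambda must exceed the largest eigenvalue for H - lambda*I to be
    # negative definite; radius_scale crudely mimics the automatic
    # step-size control of the method.
    lam = max(0.0, eigmax) + radius_scale
    return -np.linalg.solve(hess - lam * np.eye(len(grad)), grad)

# Maximize f(x, y) = -(x - 1)^2 - (y + 2)^2 from a poor start:
x = np.array([5.0, 5.0])
for _ in range(100):
    g = np.array([-2 * (x[0] - 1), -2 * (x[1] + 2)])
    H = np.array([[-2.0, 0.0], [0.0, -2.0]])
    x = x + qhc_step(g, H)
print(x)  # converges toward the maximizer (1, -2)
```

On this toy concave quadratic each iteration shrinks the error by a constant factor, so the iterates settle at the maximizer.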
10. Effectiveness of Retrieval Key Abbreviation Schemes.
- Author
Lowe, Thomas C.
- Subjects
INFORMATION retrieval, ALGORITHMS, DOCUMENTATION, INFORMATION storage & retrieval systems, MATHEMATICAL transformations, INFORMATION science
- Abstract
Frequently it is useful to abbreviate or otherwise transform keys used for the retrieval of information. These transformations include the compression of long keys into a fixed field length by operations on characters or groups of characters, hash or random transformations to obtain a direct address, or phonetic coding to group together keys that are in some way similar. The various transformations have differing effects on file retrieval schemes. Given a transformation algorithm and data to be transformed, it is possible to characterize certain qualities of the algorithm that relate to retrieval problems. This paper is concerned with some measures of effectiveness of such transformation algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 1971
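Of the key transformations listed in the abstract, phonetic coding is the easiest to illustrate. The sketch below is a simplified Soundex-style code — a standard phonetic scheme chosen here for illustration, since the abstract does not name a specific one — and it omits the classic h/w adjacency rule.

```python
def phonetic_code(name):
    """Simplified Soundex-style phonetic key: keep the first letter,
    replace later consonants with digit classes, collapse adjacent
    repeats, drop vowels, and pad to four characters."""
    classes = {**dict.fromkeys("BFPV", "1"), **dict.fromkeys("CGJKQSXZ", "2"),
               **dict.fromkeys("DT", "3"), "L": "4",
               **dict.fromkeys("MN", "5"), "R": "6"}
    name = name.upper()
    digits = [classes.get(c, "") for c in name]
    code, prev = name[0], digits[0]
    for d in digits[1:]:
        if d and d != prev:     # skip vowels and collapse repeated digits
            code += d
        prev = d
    return (code + "000")[:4]

# Similar-sounding surnames are grouped under one retrieval key:
print(phonetic_code("Robert"), phonetic_code("Rupert"))  # R163 R163
```

This is exactly the "group together keys that are in some way similar" behavior: a retrieval file keyed on the code retrieves all spelling variants of a name with one access path, at the cost of occasional false matches.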
11. Error Evaluation for Stemming Algorithms as Clustering Algorithms.
- Author
Lovins, Julie B.
- Subjects
ALGORITHMS, MATHEMATICS, ALGEBRA, INFORMATION retrieval, FOUNDATIONS of arithmetic, SUBJECT headings, QUERY (Information retrieval system)
- Abstract
This paper presents mathematical evaluation measures to characterize the effect of known erroneous performance by stemming routines, and generalizes these procedures to other types of nonstatistical clustering algorithms. When clusters, or groups of intrinsically related elements, are split into smaller groups (by under-matching the elements), there is a loss in recall in information retrieval; larger groups (caused by over-matching) induce a loss in precision or relevance. The magnitude of error is taken to be a function of the frequencies of cluster elements. When these are words in a subject-term index generated by a stemming algorithm, retrieval capability is also affected by the strength of the algorithm, the size and content of the stemmed index, and the number of words in a query. The present Project Intrex stemming algorithm has estimated stemming-error losses of 4% in recall and 1% in relevance on one-word queries; the former could be reduced to almost zero by straightforward corrections of known errors in the algorithm. An expanded probabilistic model is introduced to handle a more general case in which any element need not belong unambiguously to a single cluster. Error evaluation in document classification and thesauri is also discussed in broad terms. [ABSTRACT FROM AUTHOR]
- Published
- 1971
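The effect of under- and over-matching on recall and precision can be made concrete with a toy example; the word cluster and numbers below are invented for illustration, not taken from the Project Intrex data.

```python
def recall_precision(retrieved, relevant):
    """Recall and precision for one query over a clustered index."""
    hits = len(retrieved & relevant)
    return hits / len(relevant), hits / len(retrieved)

# One true cluster of variants of the stem "connect":
relevant = {"connect", "connected", "connection", "connecting"}

# Under-matching splits the cluster: a too-weak stemmer misses variants,
# so recall drops while precision stays perfect.
under = {"connect", "connected"}
print(recall_precision(under, relevant))   # recall 0.5, precision 1.0

# Over-matching merges unrelated terms in: precision drops instead.
over = relevant | {"connive", "conning"}
print(recall_precision(over, relevant))    # recall 1.0, precision 2/3
```

This mirrors the abstract's asymmetry: splitting costs recall, merging costs precision, and the size of either loss depends on the frequencies of the affected cluster elements.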
12. INVENTORY MODELS: OPTIMIZATION BY GEOMETRIC PROGRAMMING.
- Author
Kochenberger, Gary A.
- Subjects
MATHEMATICAL programming, ALGORITHMS, MATHEMATICS, POLYNOMIALS, APPROXIMATION theory, TECHNOLOGY
- Abstract
Geometric programming is a mathematical programming technique that is designed to determine the constrained minimum value of a generalized polynomial objective function. To date, most applications of the technique have been restricted to certain classes of engineering problems. This paper presents a brief summary of geometric programming and then illustrates its application to managerial problems by applying it to three well-known inventory models. [ABSTRACT FROM AUTHOR]
- Published
- 1971
13. DECOMPOSITION OF PLANNING SYSTEMS.
- Author
Chaiho Kim
- Subjects
PLANNING, ALGORITHMS, MATHEMATICAL decomposition, ORGANIZATIONAL behavior, DECENTRALIZATION in management, OPERATIONS research
- Abstract
While Dantzig and Wolfe formulated the decomposition algorithm as a computational device, it was subsequently recognized that the principle underlying the algorithm may also be utilized by a planning system to achieve centralized planning within the framework of a decentralized organizational structure. This paper illustrates the workings of such a planning system and, at the same time, explores some of the problems connected with its actual implementation. The planning system under study is called the decomposed planning system to distinguish it from either centralized or decentralized planning. [ABSTRACT FROM AUTHOR]
- Published
- 1970
14. The computer and the analysis of myths.
- Author
Maranda, Pierre
- Subjects
MYTHOLOGY, COMPUTERS, ELECTRONIC data processing, IMAGINATION, ALGORITHMS
- Abstract
The article focuses on the computer and the analysis of myths. The human mind can explore its own workings as revealed in two of its more powerful manifestations, the myth and the computer, in each of which it is possible to detect echoes of the other. A hundred years ago the anthropologist E.B. Tylor wrote that among those opinions which are produced by a little knowledge, to be dispelled by a little more, is the belief in an almost boundless creative power of the human imagination; the treatment of similar myths from different regions, by arranging them in large compared groups, makes it possible to trace in mythology the operation of imaginative processes recurring with the evident regularity of mental law, so that stories of which a single instance would have been a mere isolated curiosity take their place among well-marked and consistent structures of the human mind. No instrument lends itself better, in principle at any rate, to this sort of rigorous intellectual self-contemplation than the computer. The general purpose of the paper is to study the contribution of data processing to the study of mythology. The construction of algorithms in mythological studies is a complicated business, but computers can execute them with ease; the computer is eminently capable of exploring operative ambiguity.
- Published
- 1971
15. Discussion of Testing a Prediction Method for Multivariate Budgets.
- Author
Brown, Arthur A.
- Subjects
MULTIVARIATE analysis, BUDGET, ALGORITHMS, LOGARITHMS, MATHEMATICAL statistics
- Abstract
The article comments on the paper "Testing a Prediction Method for Multivariate Budgets," by Baruch Lev, that appears in the December 1, 1969 issue of the "Journal of Accounting Research." The author states that Lev discusses the need for developing a required algorithm for the calculation and the capability of planning marginal totals on the basis of broad constraints. The author explains that Lev's method of final adjustment of his cell entries does not adjust the information ratio because the required algorithm for calculation has not been developed.
- Published
- 1969
16. The Grammar of Sociology.
- Author
McGinnis, Robert
- Subjects
SET theory, SOCIAL science methodology, SOCIOLOGICAL research, DATA analysis, ALGORITHMS
- Abstract
The article comments on the applicability of set theory in sociological research. The author notes that Donald E. Muir's paper "Searching Data for Predictive Variables: A Set-Theoretical Approach" contributes to the improved scientific status of sociology, specifically by investigating one of its most ubiquitous and problematic predicates. In doing this, Muir displays the utility of set theory as a primitive but formidable tool for the construction of sociological sentences. The author further points out that set theory is a rich and sophisticated language, with an array of conjunctions with which to build complicated subject and object sets. Above all, it offers a variety of unambiguous predicates with which to build logically meaningful theory, that is, theory whose implications can be proven true by the rules of deductive logic. The article further reveals that Muir has attacked the basic scientific predicate, cause, and has come up with a useful if not revolutionary algorithm. He has moved toward the improvement of sociological sentence structure.
- Published
- 1969
17. RATES OF RETURN ON COMMON STOCK PORTFOLIOS OF LIFE INSURANCE COMPANIES: ADDENDUM.
- Author
Gentry, James A.
- Subjects
RATE of return, RATE of return on stocks, STOCKS (Finance), INSURANCE companies, INSURANCE rates, INVESTMENTS, LIFE insurance companies, ALGORITHMS, INTERNAL rate of return
- Abstract
The objectives of this article are to present a revised rate of return equation, to present corrected rates of return on the common stock portfolios of thirty-two life companies, and to update the previous analysis. The annual rate of return on the equity portfolios of life insurance companies is a key variable in this article. The rate of return equation is based on Professor Lawrence Fisher's algorithm, which produces an internal rate of return on the portfolio. The equation takes into account changes in market value, purchases, sales, and dividends of the portfolio. This method allows for continuous reinvestment of net contributions. Furthermore, it is assumed that dividends are automatically reinvested and, therefore, embedded in the total value of the annual purchases. The return is the annual compound rate earned and is a result of price changes and the flow of funds in the portfolio. Because the internal rate of return solution involves a trial-and-error approach, the Fisher algorithm utilizes an iterative process to determine an exact rate of return.
- Published
- 1971
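The trial-and-error iteration for an internal rate of return described above can be sketched with simple bisection. The interface here (`portfolio_irr`, start-of-year contributions, a single final market value) is an illustrative assumption, not Fisher's exact formulation, which also handles purchases, sales, and dividend flows throughout the period.

```python
def portfolio_irr(cash_flows, final_value, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection -- the kind of iterative
    trial-and-error search the abstract describes. cash_flows[t] is the
    net contribution at the start of year t; final_value is the
    portfolio's market value at the end of the last year."""
    def fv_gap(r):
        # Future value of all contributions compounded at rate r,
        # minus the observed final value; zero at the IRR.
        n = len(cash_flows)
        fv = sum(cf * (1 + r) ** (n - t) for t, cf in enumerate(cash_flows))
        return fv - final_value
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if fv_gap(mid) > 0:    # fv grows with r: tighten the bracket
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# 100 invested at the start of each of two years, worth 231 at the end,
# implies a compound annual return of 10%: 100(1.1)^2 + 100(1.1) = 231.
r = portfolio_irr([100, 100], 231.0)
```

Bisection is a deliberately simple stand-in for the iteration: any bracketing root-finder works because the future value is monotone in the rate when contributions are positive.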
18. A PARAMETRIC SIMPLICIAL FORMULATION OF HOUTHAKKER'S CAPACITY METHOD.
- Author
van de Panne, C. and Whinston, Andrew
- Subjects
QUADRATIC programming, CAPACITY theory (Mathematics), NONLINEAR programming, PLURIPOTENTIAL theory (Mathematics), MATHEMATICAL programming, ALGORITHMS, MATHEMATICAL optimization, ECONOMICS, ECONOMETRICS, MATHEMATICAL economics
- Abstract
The paper reformulates Houthakker's capacity method for quadratic programming in the framework of the simplex and dual methods for quadratic programming, thereby greatly reducing the conceptual and computational complexities of the method. It is shown that the method is applicable for all convex quadratic programming problems, including the case of a semi-definite matrix of the quadratic form and that of constraints in equality form. In the linear programming case the method reduces to a parametric version of the dual method. [ABSTRACT FROM AUTHOR]
- Published
- 1966
19. Concave Programming Applied to Rice Mill Location.
- Author
Candler, Wilfred, Snyder, James C., and Faught, William
- Subjects
ALGORITHMS, MATHEMATICAL programming
- Abstract
This paper provides an intuitive discussion of a concave programming algorithm capable of eventually reaching a global optimum. Two local optima are reported for a concave programming problem with 427 restraints, 4089 linear transportation activities, and 16 non-linear falling average cost milling activities. [ABSTRACT FROM AUTHOR]
- Published
- 1972
20. ON THE EXISTENCE OF A COST OF CAPITAL UNDER PURE CAPITAL RATIONING.
- Author
BURTON, R. M. and DAMON, W. W.
- Subjects
CAPITAL costs, ALGORITHMS, RATIONING, CAPITAL investments, DISCOUNT prices, MATHEMATICAL programming, INVESTMENT analysis, ECONOMETRIC models, EXTERNALITIES, INTEREST rates
- Abstract
The article demonstrates the existence of a well-defined concept for the cost of capital under pure capital rationing. Academic debate concerning the appropriate discount rate to be applied to cost of capital models has flourished since the applicability of mathematical programming to capital budgeting problems was demonstrated by Professor H. Martin Weingartner. The authors present a general proof of a reformulation of the Weingartner model for pure capital rationing. It is argued that the model should be extended to include measures from the external market.
- Published
- 1974
21. A COMMENT ON SYMINV: AN ALGORITHM FOR THE INVERSION OF A POSITIVE DEFINITE MATRIX BY THE CHOLESKY DECOMPOSITION.
- Author
Stewart, J.
- Subjects
ALGORITHMS, MATHEMATICAL decomposition, PROBABILITY theory, MATHEMATICS, MATHEMATICAL statistics, MATHEMATICAL economics, ECONOMICS, ESTIMATION theory, STOCHASTIC processes
- Abstract
The article comments on the paper "Syminv: An Algorithm for the Inversion of a Positive Definite Matrix by the Cholesky Decomposition," by Terry Seaks, published in volume 40 of "Econometrica." In his article Seaks describes an algorithm for the Cholesky decomposition of a positive symmetric matrix, citing J.H. Wilkinson's work. The author explains that the relevant conclusions in Wilkinson's paper are those for floating point calculation and that the numerical advantage of the Cholesky procedure depends on being able to accumulate inner products accurately.
- Published
- 1974
22. A Clustering Algorithm Based on User Queries.
- Author
Yu, Clement T.
- Subjects
COMPUTER science, ALGORITHMS, QUESTIONS & answers, COMPUTER programming, INFORMATION science, INFORMATION technology
- Abstract
A clustering algorithm which is tree-like in structure, and is based on user queries, is presented. It is compared to Bonner's Method, Rocchio's Method, Dattola's Method and the Single Link Method in three different aspects, namely system effectiveness, system efficiency and the time required for clustering. Experimental results using the Cranfield 424 collection indicate that the proposed method is superior to the other methods. [ABSTRACT FROM AUTHOR]
- Published
- 1974
23. A COMPUTING AID TO A HOUSING MIX PROBLEM.
- Author
LOCKETT, GEOFF
- Subjects
HOUSING, HOUSING forecasting, COMPUTER software, URBAN planning, LAND use planning, HOUSING policy, ALGORITHMS, DECISION making, PROBLEM solving, STRATEGIC planning
- Abstract
The article presents a model which assists with the problem of planning urban housing. The difficulties surrounding the problem and their causes are described. The computer program associated with the model is discussed, including the linear programming models it utilizes, along with other construction programs that may be used with the model and its overall usefulness. Various decision-making situations in which the model may be of use are also considered.
- Published
- 1973
24. Operation Manual for the IBM 7090 Exclusive Stratification Program.
- Subjects
HIERARCHY (Linguistics), COMPUTER software, ELECTRONIC data processing, BATCH processing, ALGORITHMS, INFORMATION processing
- Abstract
The article presents information on literature related to the automatic stratification of descriptors. The exclusive stratification process and algorithms are defined in the paper "Automatic Stratification of Descriptors" by David Lefkovitz. The computer program written to perform this process is coded in FORTRAN II, and an object program has been produced. The "Operation Manual for the IBM 7090 Exclusive Stratification Program" gives a very brief description of the function this program performs and indicates the manner in which it can be used for large-scale processing of data, including such features as batch processing, manual interrupt and restart, English-language or numerically coded output, and a few other printout and processing options.
- Published
- 1964
25. Black Ghetto Residents as Rioters.
- Author
Moinat, Sheryl M., Raine, Walter J., Burbeck, Stephen L., and Davison, Keith K.
- Subjects
RIOTS, SOCIOLOGY, ALGORITHMS, PSYCHOLOGY, CRIMES against public safety
- Abstract
Los Angeles riot participants, both actual and "psychological," were compared with nonparticipants to see if participation could be predicted, using stepwise multiple linear regression analysis and a second form of linear regression analysis (Wood's algorithm). Data were from 586 interviews of black residents, representing a random sample of the riot curfew area. Neither active nor psychological riot participation could be predicted when age and sex were not controlled, but significant prediction was possible when the population was divided into four groups by age and sex. The usefulness of the regression in characterizing rioters versus nonrioters is limited because a large number of independent variables is needed. Results support the theory that rioting is a community phenomenon. [ABSTRACT FROM AUTHOR]
- Published
- 1972
26. Format Recognition: A Report of a Project at the Library of Congress.
- Author
Maruyama, Lenore S.
- Subjects
BIBLIOGRAPHY, INFORMATION retrieval, INFORMATION organization, DOCUMENTATION, COMPUTER software, ALGORITHMS
- Abstract
An experiment using a computer to assign content designators to unedited machine readable bibliographic data to create MARC records is described. Input typing conventions are briefly discussed. A computer program (Assembler Language for DOS) is being developed which analyzes unedited data according to predefined algorithms and builds MARC records. A manual simulation using 150 catalog records was done to test the computer algorithms. The results of the test in terms of accuracy and throughput compared favorably with the current MARC input system in use at the Library of Congress. [ABSTRACT FROM AUTHOR]
- Published
- 1971
27. Computers Are for Concepts.
- Author
Reno, Charles
- Subjects
INSTRUCTIONAL systems, EDUCATIONAL technology, CREATIVE ability, ALGORITHMS, TEACHERS, EDUCATION
- Abstract
The article discusses how the computer is used to extend the teacher's facilities and capabilities. The computer provides three important advantages in helping to understand concepts. It provides the opportunity for discovery, exploration, and extension, opportunities that are often lacking with ordinary instructional methods. Computerized instruction not only requires more precise communication, but also offers a chance to be creative. This creativity might be expressed by the construction of an algorithm to solve a given problem, or of one to explore a new topic.
- Published
- 1970
28. The Effectiveness of Automatically Generated Weights and Links In Mechanical Indexing.
- Author
Artandi, Susan and Wolf, Edward H.
- Subjects
AUTOMATIC indexing, ALGORITHMS, TEXT files, ABSTRACTS, INDEXING, INFORMATION storage & retrieval systems
- Abstract
Work concerned with the statistical evaluation of the output of the MEDICO automatic indexing method is described. The statistical tests were designed primarily to examine the validity of the assumptions which formed the bases of the algorithms developed for the automatic computation of weights and for the automatic generation of links between index terms and modifiers. This evaluation also includes a comparison of the output generated from full text and from the processing of the abstracts or summaries of the same articles. [ABSTRACT FROM AUTHOR]
- Published
- 1969
29. Performance of Kilgour's Truncation Algorithm in Files of Different Subjects.
- Author
Kjell, Bradley
- Subjects
ALGORITHMS, LIBRARIES, LITERATURE, DOCUMENTATION, CATALOGS, RESEARCH
- Abstract
Frederick Kilgour and his associates have developed an algorithm for forming search keys for automated library catalogs; these keys consist of the first few letters of the author's last name concatenated with the first few letters of the first non-article word of the title. The article reports that the algorithm was tested in a file of about 2000 books in science and technology and in a file of about 2000 books in art and literature. Both 3-3 and 2-2 author-title search keys were tested, and both types yielded a significantly greater number of false-drops in the science and technology file than in the art and literature file.
- Published
- 1974
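The key-construction rule described in the abstract — first letters of the author's surname joined with first letters of the first non-article word of the title — can be sketched directly. The function and variable names below are illustrative, not from Kilgour's specification.

```python
ARTICLES = {"a", "an", "the"}

def search_key(author_last, title, n_author=3, n_title=3):
    """Kilgour-style truncated search key: the first n_author letters of
    the author's last name concatenated with the first n_title letters
    of the first non-article word of the title. The defaults give the
    '3-3' key; n_author=2, n_title=2 gives the '2-2' key."""
    first_word = next(w for w in title.split() if w.lower() not in ARTICLES)
    return (author_last[:n_author] + first_word[:n_title]).upper()

k33 = search_key("Knuth", "The Art of Computer Programming")        # KNUART
k22 = search_key("Knuth", "The Art of Computer Programming", 2, 2)  # KNAR
```

The shorter 2-2 key carries less information, so it collides ("false-drops") more readily than the 3-3 key — the effect whose subject-dependence the article measures.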