Search Results (25 results)
2. Introduction to Programming in LISP.
- Author
-
King, Margaret
- Abstract
The first section of this course on programming introduces LISP as a programming language designed for symbol manipulation, and hence prevalent in artificial intelligence (AI) work. A programming language specifies in its description a number of primitive operations which the computer knows how to perform. An algorithm is the set of instructions constituting a description of the method by which a task can be performed; it must be written in a programming language the computer can deal with. The constructs (elements of LISP) used in algorithms are described and an example task is provided. LISP is a functional language, in that its instructions consist of directives to apply some function to some argument. Part 1 of the course concludes with a description of some of LISP's functions; Part 2 deals with its symbol manipulation functions, specifically for manipulating the LISP constructs known as lists; and Part 3 gives instruction in writing inherently recursive functions non-recursively. Practice exercises follow each section of the course. (CLK)
- Published
- 1976
3. Solving the Quadratic Capacitated Facilities Location Problem by Computer.
- Author
-
Cote, Leon C. and Smith, Wayland P.
- Abstract
Several computer programs were developed to solve various versions of the quadratic capacitated facilities location problem. Matrices, which represent various business costs, are defined for the factors of sites, facilities, customers, commodities, and production units. The objective of the program is to find an optimization matrix for the lowest cost, given the restrictions of the problem. The algorithms, COMPAT and SWITCH, are devised to solve portions of the problem with the subroutines, CHANGE and MUNKRES. These programs are tried on several different problems, and the different results are compared. Suggestions for future research, particularly using Gilmore's branch and bound technique, are made. (WH)
- Published
- 1973
4. A review on different algorithms to find feasible solution for multiple travelling salesman problem.
- Author
-
Deshpande, Archana A., Raut, Seema, and Vaidya, Nalini V.
- Subjects
TRAVELING salesman problem, COMPUTER science, GENETIC algorithms, SALES personnel, ALGORITHMS, ANT algorithms
- Abstract
The travelling salesperson problem (TSP) is among the most studied problems in optimization and computer science. A more challenging version of the TSP is the Multiple Travelling Salesperson Problem (MTSP). The goal is to find the shortest route for each salesperson, travelling from the depot through the assigned cities and back to the depot. It is an NP-hard problem and has various applications in routing and scheduling. There are various algorithms to solve the MTSP, such as exact solution methods and bio-inspired algorithms like genetic, ant colony, and evolutionary algorithms. In this paper we review exact and bio-inspired algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
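To make the MTSP concrete, here is a minimal nearest-neighbour sketch that builds a feasible (not optimal) solution for m salespeople sharing one depot. The heuristic, function name, and coordinates are illustrative assumptions only; they are not taken from the algorithms the paper reviews.

```python
import math

def mtsp_nearest_neighbor(depot, cities, m):
    """Greedy feasible MTSP solution: m salespeople start at the depot,
    and each unvisited city is appended to the tour whose current
    endpoint is nearest to it. All tours return to the depot."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    tours = [[depot] for _ in range(m)]
    remaining = list(cities)
    while remaining:
        # pick the (tour, city) pair with the smallest extension cost
        k, city = min(((k, c) for k in range(m) for c in remaining),
                      key=lambda kc: dist(tours[kc[0]][-1], kc[1]))
        tours[k].append(city)
        remaining.remove(city)
    for t in tours:
        t.append(depot)  # each salesperson returns to the depot
    return tours

tours = mtsp_nearest_neighbor((0, 0), [(1, 1), (2, 0), (0, 3), (5, 5)], 2)
```

Exact methods and the bio-inspired algorithms surveyed in the paper would then improve on such a feasible starting solution.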
5. APPLYING EQUIVALENCE ALGORITHMS IN SOLVING PATTERN MATCHING PROBLEMS. CASE STUDY FOR EXPERT SYSTEM DESIGN.
- Author
-
Andreica, Alina
- Subjects
ALGORITHMS, SYSTEMS design, COMPUTER science, ELECTRONIC data processing, MATHEMATICAL equivalence
- Abstract
The paper focuses on means of applying simplification and equivalence algorithms in solving pattern matching problems. This solution is very useful in designing expert type systems, bringing important advantages in software design. We apply these principles in designing an expert system for plant therapy and we discuss general application principles, revealing their advantages in software design. [ABSTRACT FROM AUTHOR]
- Published
- 2016
6. SMOUTE: Synthetics Minority Over-sampling and Under-sampling TEchniques for class imbalanced problem.
- Author
-
Panote Songwattanasiri and Krung Sinapiromsaran
- Subjects
STATISTICAL sampling, ALGORITHMS, MATRICES (Mathematics), COMPUTER science, ELECTRONIC data processing
- Abstract
SMOTE is an over-sampling technique for handling a class imbalanced problem. It improves the precision measure of minority class prediction by generating more minority class instances near the existing ones. Nevertheless, the large number of synthesized minority class instances may outweigh the majority class instances. In this paper, we introduce a mixture technique of over-sampling by SMOTE and under-sampling by reduction around centroids. Our algorithm, the Synthetic Minority Over-sampling and Under-sampling TEchnique, called SMOUTE, avoids synthesizing a large number of minority class instances while balancing both classes. We perform experiments based on three classifiers: C4.5, Naïve Bayes, and multilayer perceptron. Our results show that classifiers using SMOUTE group the minority class more accurately than those using SMOTE. Moreover, SMOUTE is much faster than SMOTE for large datasets. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
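The mixture the abstract describes can be sketched in a few lines of NumPy: SMOTE-style interpolation to grow the minority class, plus k-means centroids as a stand-in for "reduction around centroids" to shrink the majority class. This is a toy reconstruction under those assumptions, not the authors' SMOUTE implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def smote_like(minority, n_new):
    """Over-sample: each synthetic point lies on the segment between a
    random minority instance and its nearest minority neighbour."""
    idx = rng.integers(len(minority), size=n_new)
    base = minority[idx]
    d = np.linalg.norm(minority[None, :, :] - base[:, None, :], axis=2)
    d[np.arange(n_new), idx] = np.inf       # exclude each point's own distance
    nn = minority[d.argmin(axis=1)]
    gap = rng.random((n_new, 1))
    return base + gap * (nn - base)

def centroid_undersample(majority, k, iters=10):
    """Under-sample: replace the majority class by k k-means centroids
    (a simple stand-in for the paper's reduction around centroids)."""
    centroids = majority[rng.choice(len(majority), k, replace=False)]
    for _ in range(iters):
        labels = np.linalg.norm(majority[:, None] - centroids[None], axis=2).argmin(1)
        centroids = np.array([majority[labels == j].mean(0) if np.any(labels == j)
                              else centroids[j] for j in range(k)])
    return centroids

minority = rng.normal(0, 1, (10, 2))
majority = rng.normal(3, 1, (100, 2))
balanced_min = np.vstack([minority, smote_like(minority, 40)])  # 50 points
balanced_maj = centroid_undersample(majority, 50)               # 50 points
```

Both classes end up the same size without synthesizing the large number of minority instances that plain SMOTE would need.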
7. Measuring Optimal Connections in Large Networks.
- Author
-
Yang, Song and Hexmoor, Henry
- Subjects
JAVA programming language, INFORMATION technology, COMPUTER science, SOCIAL networks, ALGORITHMS
- Abstract
Researchers have identified several problems in measuring the strongest path connecting pairs of actors in valued graphs. To address these problems, it has been proposed that the average path value (APV) be used to indicate optimal connections between dyads. However, the lack of a proper computer algorithm and implementation has hindered wide-ranging application of the proposed APV solution. In this paper we develop a computer algorithm and fully implement it in four JAVA programs, which are available on request. These programs produce an optimal connection matrix, which is subsequently input into UCINET for further multidimensional scaling and clustering analysis. We demonstrate this procedure with a data matrix containing 38 organizations in information technology. We discuss the methodological implications of the application of our algorithm to future social network studies. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
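As a rough illustration of the APV idea, the sketch below brute-forces the path between two actors that maximizes the average of its tie values on a tiny valued graph. The graph, names, and exhaustive search are assumptions for illustration; the paper's algorithm is designed to scale to large networks.

```python
from itertools import permutations

def average_path_value(graph, src, dst, max_len=4):
    """Return the best average tie value over directed paths from src
    to dst. 'graph' maps (u, v) edges to tie values; paths up to
    max_len intermediate-node permutations are tried."""
    nodes = {u for e in graph for u in e} - {src, dst}
    best = 0.0
    for k in range(max_len):
        for mid in permutations(nodes, k):
            path = (src,) + mid + (dst,)
            edges = list(zip(path, path[1:]))
            if all(e in graph for e in edges):
                apv = sum(graph[e] for e in edges) / len(edges)
                best = max(best, apv)
    return best

g = {("a", "b"): 1, ("b", "c"): 5, ("a", "c"): 2}
# the direct tie a-c has value 2; the detour a-b-c averages (1 + 5) / 2
```

The optimal connection matrix in the paper collects such dyad-level values for every pair of actors before being handed to UCINET.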
8. 5th International Symposium on Intelligent Systems and Algorithms, ISA 2018.
- Author
-
Kotyrba, Martin, Volna, Eva, and Zelinka, Ivan
- Subjects
SYSTEMS on a chip, ALGORITHMS, SCIENTIFIC literature, TECHNOLOGY, SOFT computing, COMPUTER science
- Published
- 2019
- Full Text
- View/download PDF
9. RECENT COMPUTATIONAL INTELLIGENCE DEVELOPMENTS: TECHNICAL AND SOCIAL ASPECTS.
- Author
-
MOLOIU, Alexandra Ştefania, ALBEANU, Grigore, and POPENTIU-VLADICESCU, Florin
- Subjects
COMPUTATIONAL intelligence, PATTERN recognition systems, REINFORCEMENT learning, SUPERVISED learning, COMPUTER science, ALGORITHMS, HUMAN facial recognition software
- Abstract
Computational Intelligence (CI) is a dynamic field of research, covering a large range of processing methods and providing value to many real-life applications. From fuzzy reasoning, through evolutionary computation, to machine learning (ML), important CI paradigms have proven their value in science and technology. Learning from data through examples, as the ML way of thinking, is the main characteristic of the new smart systems; older systems used pre-programmed rules to support a mainly human-centred decision process. Machine learning draws on methods from computer science, statistics, and data science. Learning from data is possible through supervised learning, unsupervised learning, semi-supervised learning, reinforcement learning, transduction, and learning to learn (induction). The mathematical support is based mainly on optimisation techniques for neural network training; Matlab, Python, and various frameworks provide the technological support for implementation and testing. This paper gives a broad view of recent computational intelligence algorithms and presents some appropriate case studies covering character recognition, electrocardiogram (ECG) analysis for individual identification, and robust face recognition. A deep learning approach is used for robust face recognition, and experiments with several multilayer neural network architectures are presented. The performance of such identification systems depends not only on the method used for feature extraction but also on the type of matching algorithm and the interval of time allocated for training. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
10. USING SELECTION ATTRIBUTE ALGORITHMS FROM DATA MINING TO COMPLEMENT THE REP-INDEX.
- Author
-
Vivian, Gláucio R., Cervi, Cristiano R., and Rovadosky, Diogo N.
- Subjects
DATA mining, RESEARCH management, ALGORITHMS, SCIENTOMETRICS, COMPUTER science
- Abstract
In this paper we propose a complement to the Rep-Index reputation metric for researchers, using data mining techniques for selecting a subset of the elements most relevant to the Rep-Index. We redid the experiments with the original authors' data and found the best algorithm. At the end we propose new weights for the classification of researchers, with a correct classification rate higher than the original Rep-Index. For Computer Science, the best attribute selection algorithm is InfoGain; Spearman's correlation is 0.5520, against 0.3932 for the Rep-Index with the original weights. The results show that the Rep-Index is adaptable and flexible, and that the proposed approach complements the original index to offer a better classification of researchers. [ABSTRACT FROM AUTHOR]
- Published
- 2016
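The InfoGain criterion the abstract names is standard information gain from decision-tree learning; the minimal sketch below computes it for a discrete attribute. The function names and toy data are illustrative, not the paper's Rep-Index pipeline.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(values, labels):
    """Information gain of an attribute: label entropy minus the
    weighted entropy of the label subsets after splitting on the
    attribute's values."""
    n = len(labels)
    split = {}
    for v, y in zip(values, labels):
        split.setdefault(v, []).append(y)
    remainder = sum(len(ys) / n * entropy(ys) for ys in split.values())
    return entropy(labels) - remainder

# a perfectly informative attribute recovers the full label entropy
g = info_gain(["a", "a", "b", "b"], [0, 0, 1, 1])  # == 1.0
```

Attribute selection then simply ranks candidate elements by this score and keeps the top subset.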
11. A new fast convex hull algorithm based on the rectangular segmentation.
- Author
-
Cheng-Yi Ou, Xiang-Sha Liu, Jun-Yan Ma, Xiao-Ping Liao, and Feng-Ying Long
- Subjects
ALGORITHMS, COMPUTER science, COMPUTING platforms, CONVEX domains, COMPUTATIONAL geometry, EDUCATION
- Published
- 2015
12. Partial Reconfiguration and Novel Algorithm for Matrix Inverse Computations.
- Author
-
Mbe Mbock, Etienne Aubin
- Subjects
MATRIX inversion, MATRICES (Mathematics), ALGORITHMS, MATHEMATICAL optimization, COMPUTER science
- Abstract
Partial reconfiguration of algorithms is becoming increasingly attractive in many computational applications. This research article covers two aspects of the reconfiguration approach: the first shows that partial reconfiguration is capable of reconstructing computations; the second constructs a theoretical hardware device that realises these computations. We analyse the importance of partial reconfiguration for algorithms on the one hand, and on the other we apply this concept to devise a method that computes two matrices that are inverses of each other. In this paper we specify the computation of two inverse upper and lower matrices using the partial dynamic reconfigurability concept, and we propose for this novel algorithm a pseudo-code implementation and its hardware construction. [ABSTRACT FROM AUTHOR]
- Published
- 2013
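For orientation, here is a plain-software sketch of inverting an upper-triangular matrix by back-substitution, assuming the "upper and lower matrices" of the abstract are triangular. It is only a baseline reference point; the paper's contribution is realising such inverse computations via partial dynamic reconfiguration in hardware.

```python
def invert_upper_triangular(U):
    """Invert a nonsingular upper-triangular matrix column by column:
    each column of the inverse solves U @ x = e_col, filled in from
    the bottom row upward (back-substitution)."""
    n = len(U)
    inv = [[0.0] * n for _ in range(n)]
    for col in range(n):
        for i in range(n - 1, -1, -1):
            s = 1.0 if i == col else 0.0          # entry of the unit vector e_col
            s -= sum(U[i][j] * inv[j][col] for j in range(i + 1, n))
            inv[i][col] = s / U[i][i]             # diagonal must be nonzero
    return inv

U = [[2.0, 1.0], [0.0, 4.0]]
U_inv = invert_upper_triangular(U)
```

The lower-triangular case is symmetric, substituting from the top row downward.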
13. FROM THE ALGORITHMIC THINKING TO THE CONCEPT OF PROJECT.
- Author
-
VLADA, Marin
- Subjects
ALGORITHMS, PROBLEM solving, SCIENTIFIC knowledge, INFORMATION technology, COMPUTER algorithms, ELECTRONIC data processing personnel
- Abstract
This article presents several important topics that show the importance of algorithms and problem solving in the development of scientific knowledge. Problem-solving processes require demonstrative, algorithmic thinking. This paper presents a modern approach to the concept of algorithm and explains in detail how the concept can be seen (by students, teachers, and specialists) when taking into account various aspects that are virtually disregarded in the classical approach. The dynamic evolution of the algorithm concept has been driven by the competence and experience of IT professionals in solving problems with computers. Algorithms, both established and under development, underpin the implementation of various projects, and projects in all fields have in turn driven the development of society. [ABSTRACT FROM AUTHOR]
- Published
- 2011
14. An Efficient Coverage Control Algorithm in MANET.
- Author
-
Young-jun Oh and Kang-whan Lee
- Subjects
AD hoc computer networks, ALGORITHMS, COMPUTER science
- Published
- 2015
15. A Differential Testing Approach for Evaluating Abstract Syntax Tree Mapping Algorithms.
- Author
-
Yuanrui Fan, Xin Xia, Lo, David, Hassan, Ahmed E., Yuan Wang, and Shanping Li
- Subjects
ALGORITHMS, JAVA programming language, COMPUTER science, SOFTWARE engineers, ARTIFICIAL intelligence
- Abstract
Abstract syntax tree (AST) mapping algorithms are widely used to analyze changes in source code. Despite their foundational role, little effort has been made to evaluate the accuracy of AST mapping algorithms, i.e., the extent to which an algorithm captures the evolution of code. We observe that a program element often has only one best-mapped program element. Based on this observation, we propose a hierarchical approach to automatically compare the similarity of statements and tokens mapped by different algorithms. By performing the comparison, we determine whether each of the compared algorithms generates inaccurate mappings for a statement or its tokens. We invited 12 external experts to determine whether three commonly used AST mapping algorithms generate accurate mappings for a statement and its tokens, for 200 statements. Based on the experts' feedback, our approach achieves a precision of 0.98-1.00 and a recall of 0.65-0.75. Furthermore, we conduct a large-scale study with a dataset of ten Java projects containing a total of 263,165 file revisions. Our approach determines that GumTree, MTDiff and IJM generate inaccurate mappings for 20%-29%, 25%-36% and 21%-30% of the file revisions, respectively. Our experimental results show that state-of-the-art AST mapping algorithms still need improvement. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
16. Symbolic Repairs for GR(1) Specifications.
- Author
-
Maoz, Shahar, Ringert, Jan Oliver, and Shalom, Rafi
- Subjects
SOFTWARE engineering, SOFTWARE engineers, ARTIFICIAL intelligence, COMPUTER science, ALGORITHMS
- Abstract
Unrealizability is a major challenge for GR(1), an expressive assume-guarantee fragment of LTL that enables efficient synthesis. Some works attempt to help engineers deal with unrealizability by generating counter-strategies or computing an unrealizable core. Other works propose to repair the unrealizable specification by suggesting repairs in the form of automatically generated assumptions. In this work we present two novel symbolic algorithms for repairing unrealizable GR(1) specifications. The first algorithm infers new assumptions based on the recently introduced JVTS. The second algorithm infers new assumptions directly from the specification. Both algorithms are sound. The first is incomplete but can be used to suggest many different repairs. The second is complete but suggests a single repair. Both are symbolic and therefore efficient. We implemented our work, validated its correctness, and evaluated it on benchmarks from the literature. The evaluation shows the strength of our algorithms, in their ability to suggest repairs and in their performance and scalability compared to previous solutions. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
17. Statistical Algorithmic Profiling for Randomized Approximate Programs.
- Author
-
Joshi, Keyur, Fernando, Vimuth, and Misailovic, Sasa
- Subjects
ALGORITHMS, COMPUTER science, COMPUTER software development, ARTIFICIAL intelligence, MOBILE apps
- Abstract
Many modern applications require low-latency processing of large data sets, often by using approximate algorithms that trade accuracy of the results for faster execution or reduced memory consumption. Although the algorithms provide probabilistic accuracy and performance guarantees, a software developer who implements these algorithms has little support from existing tools. Standard profilers do not consider accuracy of the computation and do not check whether the outputs of these programs satisfy their accuracy specifications. We present AXPROF, an algorithmic profiling framework for analyzing randomized approximate programs. The developer provides the accuracy specification as a formula in a mathematical notation, using probability or expected value predicates. AXPROF automatically generates statistical reasoning code. It first constructs the empirical models of accuracy, time, and memory consumption. It then selects and runs appropriate statistical tests that can, with high confidence, determine if the implementation satisfies the specification. We used AXPROF to profile 15 approximate applications from three domains -- data analytics, numerical linear algebra, and approximate computing. AXPROF was effective in finding bugs and identifying various performance optimizations. In particular, we discovered five previously unknown bugs in the implementations of the algorithms and created fixes, guided by AXPROF. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
18. Preface.
- Subjects
COMPUTER science, MATHEMATICS education, ALGORITHMS
- Published
- 2016
19. PARALLEL GENETIC ALGORITHM FOR SAT PROBLEMS BASED ON THE COARSE-GRAINED MODEL.
- Author
-
GUANFENG WU, YANG XU, WENJING CHANG, and XIAOMEI ZHONG
- Subjects
GENETIC algorithms, HEURISTIC, HEURISTIC algorithms, ARTIFICIAL intelligence, COMPUTER science, ALGORITHMS
- Published
- 2016
20. Experiences with CS2 and data structures in the 100 problems format.
- Author
-
Kraft, Nicholas, Hong, Xiaoyan, Lusth, John C., and McCallum, Debra
- Abstract
A dissatisfaction appears to permeate the process of educating computer science students. Both students and instructors seem uninspired in the classroom, resulting in many attempts to enliven, freshen, and improve the experience. These attempts show efficacy, but the pace of improvement is slow. 100 Problems (100P) is an innovative guided discovery curriculum in which students are freed from the classroom and instead work on 100 concept- and research-related problems throughout their undergraduate careers. The 100 problems guide the students to discover the fundamental knowledge and skills required of a graduate of the degree program. Each student is free to create an individualized mode of learning and discovery. As such, the curriculum fosters deep learning among students and challenges students' intellectual growth. In this paper we introduce the 100P curriculum, describe the 100P course format and our experiences offering courses in this format, and report our early findings. [ABSTRACT FROM PUBLISHER]
- Published
- 2011
- Full Text
- View/download PDF
21. Implicit Media Knowledge Experiments & Results.
- Author
-
Ly, Muy-Chu and Germaneau, Alexis
- Subjects
MASS media, ALGORITHMS, CLUSTER analysis (Statistics), INDUSTRIAL research, TECHNOLOGICAL progress, TECHNOLOGICAL innovations, COMPUTER science, INTERNET, WORLD Wide Web
- Abstract
Implicit Media Knowledge aims to provide relevant information related to visual media without effort. It is based on the analysis of media usage from several users (e.g. a community). Algorithms based on clustering methods that extract relevant information (e.g. tags, taxonomy trees) related to a media from its usage are detailed. To validate our new approach, we propose to apply our concept and algorithms on a specific media use such as the analysis of how multiple users organize their media files. Significant results of two experiments will be highlighted. Perspectives of our work will be finally presented. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
22. Using Resource Competition Strategy to Achieve Distributive Scheduling in the Grid and Cloud Computing Environments.
- Author
-
Chiou, Andy S. and Wen-Jiun Lin
- Subjects
ALGORITHMS, CLOUD computing, GRID computing, DISTRIBUTED computing, COMPUTER science
- Abstract
The emerging Cloud and Grid computing environments have become the de facto standard for distributed computation. These environments are dynamic in the sense that the arrivals of user requests are unpredictable, and because of this variability, traditional scheduling algorithms fail to match user requests to the most feasible processing elements: no prior knowledge about the requests is available until they arrive at the schedulers. The situation is worse in large-scale systems, where a single scheduler is often the bottleneck, its queue congested with a large number of user requests. One solution to this problem is to use multiple schedulers. However, the scheduling complexity can then rise dramatically, since a scheduler must consult the others and maintain a global state of all processing elements in order to generate a feasible plan; the cost of synchronizing the schedulers and processing elements can itself become the major cause of system degradation. In this research, we take a resource-competition approach in which the schedulers compete for resources without acknowledging other competitors, and each processing element accepts only the first request that arrives, rejecting the others. We conduct thorough simulations and the results are positive. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
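The competition rule described in the abstract (first request wins, others are rejected) can be mimicked with a toy round-based simulation. Everything here, from the random element choice to the round structure, is an illustrative assumption rather than the authors' simulation setup.

```python
import random

random.seed(1)

def compete(schedulers, n_elements, rounds=50):
    """Each round, every scheduler independently claims a random
    processing element without consulting the others; an element
    accepts only the first claim it receives and rejects the rest.
    Rejected schedulers simply retry in the next round."""
    served = {s: 0 for s in schedulers}
    for _ in range(rounds):
        claimed = {}
        for s in schedulers:
            pe = random.randrange(n_elements)
            if pe not in claimed:   # first request wins the element
                claimed[pe] = s
                served[s] += 1
    return served

served = compete(["s1", "s2", "s3"], n_elements=10)
```

No scheduler maintains global state, which is exactly what removes the synchronization cost the abstract identifies.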
23. AN EFFICIENT LINEAR-TIME CLUSTERING ALGORITHMS.
- Author
-
WANG Li, ZANG Liangjun, and SONG Renfeng
- Subjects
DATA mining, ALGORITHMS, LINEAR systems, PARALLEL algorithms, COMPUTER science
- Published
- 2005
24. Timetabling Highly Constrained System via Genetic Algorithms.
- Author
-
LEE, HO SUNG C. and HERMOSILLA, AUGUSTO Y.
- Subjects
GENETIC algorithms, HEURISTIC programming, COMPUTER science, ALGORITHMS, COMPUTER programming
- Published
- 2002
- Full Text
- View/download PDF
25. Pragmatic Algorithmic Game Theory.
- Author
-
LEYTON-BROWN, KEVIN
- Subjects
GAME theory, ALGORITHMS, COMPUTER science, MICROECONOMICS, PRAGMATISM
- Abstract
Algorithmic Game Theory (AGT) studies problems at the interface between computer science and microeconomics. Research in this area typically attacks very general settings using theoretical tools. There are great advantages to such an approach: in particular, the field has amassed an impressive range of sweeping impossibility, optimality, and approximation results. However, sometimes it is very difficult to obtain a clean theoretical result that addresses a complex, real-world problem of interest. We can often say more about realistic problems if we're willing to be pragmatic. In particular, progress can often be made by leveraging one or both of the following forms of pragmatism: 1. Aiming to achieve good performance only on problems of interest, rather than in relatively unconstrained settings; and 2. Working with statistical rather than analytical tools, thereby defining problems of interest implicitly via a dataset and/or appealing to data-driven measures of performance. This talk will survey three broad problems that I have attacked using such pragmatic approaches in my own research. (There is, of course, a wide range of excellent work in this vein by others; however, surveying it is beyond the scope of this talk.) First, a central problem in AGT is reasoning about equilibrium behavior in games. A seminal result was that the identification of a sample Nash equilibrium is PPAD-complete even in two-player games, meaning that the problem is probably intractable in the worst case. However, it is nevertheless often still possible to reason about games of interest, because they often exhibit various structural regularities. I'll describe the Action Graph Game (AGG) formalism, and explain how restricting ourselves to games that are compact in this (general) language yields exponential performance improvements. AGGs are particularly interesting for AGT researchers because they can compactly model messy, realistic mechanisms like advertising auctions or voting systems. 
I'll discuss what can be gained by analyzing such mechanisms in this way, and introduce some general tools that researchers can use to easily leverage these techniques in their own work. A second problem at the core of AGT is market design. I'll focus here on the FCC's upcoming "incentive auction", in which television broadcasters will be given the opportunity to sell their broadcast rights, remaining broadcasters will be repacked into a smaller block of spectrum, and the freed airwaves will be resold to telecom companies. The stakes for this auction are huge (projected tens of billions of dollars in revenue for the government), justifying the design of a special-purpose descending-price auction mechanism, which I'll describe. An inner-loop problem in this mechanism is determining whether a given set of broadcasters can be repacked into a smaller block of spectrum while respecting radio interference constraints. This is an instance of a (worst-case intractable) graph coloring problem; however, stations' broadcast locations and interference constraints are all known in advance. I'll explain how this information can be leveraged by machine learning methods to obtain a SAT-based algorithm that can very quickly solve nearly all repacking problems encountered in practice. Finally, I'll briefly discuss Kudu, a market for agricultural commodities that my collaborators and I have built to help smallhold farmers in Uganda. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
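The repacking feasibility question in the abstract reduces to graph coloring: stations are vertices, interference constraints are edges, and channels are colors. The tiny backtracking checker below illustrates the reduction only; the talk's actual solver is SAT-based and tuned with machine learning, which this sketch does not attempt.

```python
def repack(stations, interferes, channels):
    """Backtracking check of whether every station can be assigned a
    channel such that no two interfering stations share one. Returns
    an assignment dict, or None if no feasible repacking exists."""
    assignment = {}

    def assign(i):
        if i == len(stations):
            return True
        s = stations[i]
        for ch in channels:
            # a channel is usable if no interfering neighbour holds it
            if all(assignment.get(t) != ch for t in interferes.get(s, ())):
                assignment[s] = ch
                if assign(i + 1):
                    return True
                del assignment[s]   # backtrack
        return False

    return dict(assignment) if assign(0) else None

# a triangle of mutual interference needs three channels, so two fail
tri = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
feasible = repack(["A", "B", "C"], tri, [1, 2, 3])
infeasible = repack(["A", "B", "C"], tri, [1, 2])
```

Because real instances have fixed, known interference graphs, a solver can be specialized to them offline, which is the pragmatic point of the talk.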
Discovery Service for Jio Institute Digital Library