159 results on "Data structures -- Research"
Search Results
2. Data structures for higher-dimensional rectilinear packing
- Author
- Allen, Sam D. and Burke, Edmund K.
- Subjects
- Data structures -- Research, Mathematical optimization, Algorithms -- Research, Algorithm, Computers, Science and technology
- Abstract
This paper presents an abstract data type (ADT) for producing higher-dimensional rectilinear packings using constructive methods, the Skyline ADT. Furthermore, a novel method and several approaches from the [...]
- Published
- 2012
3. Metadata elements
- Author
- Coyle, Karen
- Subjects
- Data structures -- Research, Library science -- Research, Online services -- Usage, Cable television/data services, Online services, Library and information science
- Abstract
Chapter 3 discusses metadata elements, the building blocks of any metadata scheme. Metadata elements have been defined using Semantic Web standards, which form a universal pool from which anyone [...]
- Published
- 2012
4. WYSINWYX: what you see is not what you execute
- Author
- Balakrishnan, Gogul and Reps, Thomas
- Subjects
- Data structures -- Research, Reverse engineering -- Research, Semantics -- Research
- Abstract
Over the last seven years, we have developed static-analysis methods to recover a good approximation to the variables and dynamically allocated memory objects of a stripped executable, and to track the flow of values through them. The article presents the algorithms that we developed, explains how they are used to recover Intermediate Representations (IRs) from executables that are similar to the IRs that would be available if one started from source code, and describes their application in the context of program understanding and automated bug hunting. Unlike algorithms for analyzing executables that existed prior to our work, the ones presented in this article provide useful information about memory accesses, even in the absence of debugging information. The ideas described in the article are incorporated in a tool for analyzing Intel x86 executables, called CodeSurfer/x86. CodeSurfer/x86 builds a system dependence graph for the program, and provides a GUI for exploring the graph by (i) navigating its edges, and (ii) invoking operations, such as forward slicing, backward slicing, and chopping, to discover how parts of the program can impact other parts. To assess the usefulness of the IRs recovered by CodeSurfer/x86 in the context of automated bug hunting, we built a tool on top of CodeSurfer/x86, called Device-Driver Analyzer for x86 (DDA/x86), which analyzes device-driver executables for bugs. Without the benefit of either source code or symbol-table/debugging information, DDA/x86 was able to find known bugs (that had been discovered previously by source-code analysis tools), along with useful error traces, while having a low false-positive rate. DDA/x86 is the first known application of program analysis/verification techniques to industrial executables. 
Categories and Subject Descriptors: D.2.4 [Software Engineering]: Software/Program Verification--Assertion checkers; model checking; D.2.5 [Software Engineering]: Testing and Debugging--Symbolic execution; D.2.7 [Software Engineering]: Distribution, Maintenance, and Enhancement--Restructuring, reverse engineering, and reengineering; D.3.2 [Programming Languages]: Language Classifications--Macro and assembly languages; D.4.6 [Operating Systems]: Security and Protection--Invasive software; E.1 [Data]: Data Structures--Arrays; lists, stacks, and queues; records; F.3.2 [Logics and Meanings of Programs]: Semantics of Programming Languages--Program analysis
General Terms: Algorithms, Security, Theory, Verification
Additional Key Words and Phrases: Abstract interpretation, context-sensitive analysis, data structure recovery, interprocedural dataflow analysis, pointer analysis, reverse engineering, static analysis
DOI: 10.1145/1749608.1749612
- Published
- 2010
5. Verifying safety properties of concurrent heap-manipulating programs
- Author
- Yahav, Eran and Sagiv, Mooly
- Subjects
- Algorithm, Java, Algorithms -- Research, Debugging -- Methods, Data structures -- Research, Java (Computer program language) -- Usage
- Abstract
We provide a parametric framework for verifying safety properties of concurrent heap-manipulating programs. The framework combines thread-scheduling information with information about the shape of the heap. This leads to verification algorithms that are more precise than existing techniques. The framework also provides a precise shape-analysis algorithm for concurrent programs. In contrast to most existing verification techniques, we do not put a bound on the number of allocated objects. The framework produces interesting results even when analyzing programs with an unbounded number of threads. The framework is applied to successfully verify the following properties of a concurrent program: --Concurrent manipulation of linked-list based ADT preserves the ADT datatype invariant. --The program does not perform inconsistent updates due to interference. --The program does not reach a deadlock. --The program does not produce runtime errors due to illegal thread interactions. We also found bugs in erroneous programs violating such properties. A prototype of our framework has been implemented and applied to small, but interesting, example programs. Categories and Subject Descriptors: D.2.5 [Software Engineering]: Testing and Debugging--Symbolic execution; D.3.3 [Programming Languages]: Language Constructs and Features--Data types and structures; dynamic storage management; E.1 [Data]: Data Structures--Graphs; lists; trees; E.2 [Data]: Data Storage Representations--Composite structures; linked representations; F.3.1 [Logics and Meanings of Programs]: Specifying and Verifying and Reasoning about Programs--Assertions; invariants General Terms: Algorithms, Languages, Theory, Verification Additional Key Words and Phrases: Abstract interpretation, verification, concurrency, shape-analysis, safety properties, Java DOI = 10.1145/1745312.1745315
- Published
- 2010
6. Convexity and concavity detection in computational graphs: tree walks for convexity assessment
- Author
- Fourer, Robert, Maheshwari, Chandrakant, Neumaier, Arnold, Orban, Dominique, and Schichl, Hermann
- Subjects
- Data structures -- Research, Convex domains -- Research, Graphic methods -- Usage, Mathematical optimization, Functional equations -- Research, Functions -- Research, Computers, Science and technology
- Abstract
We examine symbolic tools associated with two modeling systems for mathematical programming, which can be used to automatically detect the presence or absence of convexity and concavity in the objective [...]
- Published
- 2010
7. NV-tree: an efficient disk-based index for approximate search in very large high-dimensional collections
- Author
- Lejsek, Herwig, Asmundsson, Fridrik Heidar, Jonsson, Bjorn Por, and Amsaleg, Laurent
- Subjects
- 10GB - 14.99GB hard disk drive, 15GB - 19.99GB hard disk drive, 20GB - 25GB hard disk drive, 5GB - 9.99GB hard disk drive, Hard disk drive, Over 25GB hard disk drive, Under 5GB hard disk drive, Data structures -- Research, Hard disks -- Research, Image processing -- Analysis, Indexes -- Research
- Abstract
Over the last two decades, much research effort has been spent on nearest neighbor search in high-dimensional data sets. Most of the approaches published thus far have, however, only been tested on rather small collections. When large collections have been considered, high-performance environments have been used, in particular systems with a large main memory. Accessing data on disk has largely been avoided because disk operations are considered to be too slow. It has been shown, however, that using large amounts of memory is generally not an economic choice. Therefore, we propose the NV-tree, which is a very efficient disk-based data structure that can give good approximate answers to nearest neighbor queries with a single disk operation, even for very large collections of high-dimensional data. Using a single NV-tree, the returned results have high recall but contain a number of false positives. By combining two or three NV-trees, most of those false positives can be avoided while retaining the high recall. Finally, we compare the NV-tree to Locality Sensitive Hashing, a popular method for ε-distance search. We show that they return results of similar quality, but the NV-tree uses many fewer disk reads. Index Terms--High-dimensional indexing, multimedia indexing, very large databases, approximate searches.
- Published
- 2009
8. Fast generation of three-dimensional video holograms by combined use of data compression and lookup table techniques
- Author
- Kim, Seung-Cheol, Yoon, Jung-Hoon, and Kim, Eun-Soo
- Subjects
- Data structures -- Research, Holography -- Research, Three-dimensional display systems -- Research, Video recordings -- Methods, Data compression -- Methods, 3D technology, Astronomy, Physics
- Abstract
Even though there are many types of methods to generate CGH (computer-generated hologram) patterns of three-dimensional (3D) objects, most of them have been applied to still images but not to video images due to their computational complexity in applications of 3D video holograms. A new method for fast computation of CGH patterns for 3D video images is proposed by combined use of data compression and lookup table techniques. Temporally redundant data of the 3D video images are removed with the differential pulse code modulation (DPCM) algorithm, and then the CGH patterns for these compressed videos are generated with the novel lookup table (N-LUT) technique. To confirm the feasibility of the proposed method, some experiments with test 3D videos are carried out, and the results are comparatively discussed with the conventional methods in terms of the number of object points and computation time. OCIS codes: 090.1760, 090.5694, 050.1940, 100.6890.
- Published
- 2008
9. Decoupled 3-D zerotree structure for wavelet-based video coding
- Author
- Zhang, Liang, Wang, Demin, and Vincent, Andre
- Subjects
- Algorithms -- Usage, Image coding -- Methods, Data structures -- Research, Wavelet transforms -- Usage, Algorithm, Business, Electronics, Mass communications
- Abstract
A novel data structure is proposed for magnitude-ordering 3-D wavelet transform coefficients. This data structure consists of temporal 1-D orientation trees followed by spatial 2-D orientation trees. Based on this data structure, a coding algorithm is developed for the embedded coding of 3-D wavelet transform coefficients. Experiment results confirm that the proposed coding algorithm outperforms a conventional algorithm that uses asymmetric 3-D orientation trees. Index Terms--Embedded coding, set partitioning in hierarchical trees (SPIHT), video coding, wavelet transform.
- Published
- 2008
10. Tracing information flow on a global scale using Internet chain-letter data
- Author
- Liben-Nowell, David and Kleinberg, Jon
- Subjects
- Online social networks -- Research, Algorithms -- Methods, Chain letters -- Research, Data structures -- Research, Algorithm, Science and technology
- Abstract
Although information, news, and opinions continuously circulate in the worldwide social network, the actual mechanics of how any single piece of information spreads on a global scale have been a mystery. Here, we trace such information-spreading processes at a person-by-person level using methods to reconstruct the propagation of massively circulated Internet chain letters. We find that rather than fanning out widely, reaching many people in very few steps according to 'small-world' principles, the progress of these chain letters proceeds in a narrow but very deep tree-like pattern, continuing for several hundred steps. This suggests a new and more complex picture for the spread of information through a social network. We describe a probabilistic model based on network clustering and asynchronous response times that produces trees with this characteristic structure on social-network data. social networks | algorithms | epidemics | diffusion in networks
- Published
- 2008
11. Bounded-collision memory-mapping schemes for data structures with applications to parallel memories
- Author
- Cordasco, Gennaro, Scarano, Vittorio, and Rosenberg, Arnold L.
- Subjects
- Data structures -- Research, Memory (Computers) -- Design and construction, Parallel computers -- Usage, Semiconductor memory, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
A technique for mapping structured data onto parallel memory modules is presented, limiting the number of conflicts. The technique allows distinct processors simultaneous access to the same memory module.
- Published
- 2007
12. (Almost) tight bounds and existence theorems for single-commodity confluent flows
- Author
- Chen, Jiangzhou, Kleinberg, Robert D., Lovasz, Laszlo, Rajaraman, Rajmohan, Sundaram, Ravi, and Vetta, Adrian
- Subjects
- Existence theorems -- Research, Branch and bound algorithms -- Research, Network analysis (Planning) -- Research, Data structures -- Research, Approximation theory -- Research
- Published
- 2007
13. Dynamic segment trees for ranges and prefixes
- Author
- Yeim-Kuan Chang and Yung-Chieh Lin
- Subjects
- Semiconductor memory, Data structures -- Research, Binary trees (Computers) -- Analysis, Memory (Computers) -- Usage
- Published
- 2007
14. Scale-changing technique for the electromagnetic modeling of MEMS-controlled planar phase shifters
- Author
- Perret, Etienne, Aubert, Herve, and Legay, Herve
- Subjects
- Arrays (Data structures) -- Research, Arrays (Data structures) -- Usage, Data structures -- Research, Microelectromechanical systems -- Research, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
A scale-changing approach is proposed for the electromagnetic modeling of phase-shifter elements used in reconfigurable microelectromechanical system (MEMS)-controlled reflectarrays. Based on the partition of the discontinuity plane in planar sub-domains with various scale levels, this technique allows the computation of the phase shift from the simple cascade of networks, each network describing the electromagnetic coupling between two scale levels. The high flexibility of the approach, associated with the advantages of integral-equation formulations, renders this original approach powerful and rapid. The scale-changing technique allows quasi-instantaneous computing of the 1024 phase shifts achieved by ten RF-MEMS switches distributed on the phase-shifter surface. Moreover, the proposed approach is much faster than finite-element-method-based software. Experimental data are given for validation purposes. Index Terms--Multiscale structures, planar phase shifter, reflectarrays, RF microelectromechanical systems (MEMS).
- Published
- 2006
15. Bayesian measures of explained variance and pooling in multilevel (Hierarchical) models
- Author
- Gelman, Andrew and Pardoe, Iain
- Subjects
- Data structures -- Research, Regression analysis, Bayesian statistical decision theory, Engineering and manufacturing industries, Mathematics, Science and technology
- Abstract
An approach is presented to defining explained variance (R^2) at each level of the multilevel model, rather than attempting to create a single summary measure of fit. A related variance comparison is discussed that summarizes the degree to which estimates at each level of the model are pooled together based on the level-specific regression relationship, rather than estimated separately.
- Published
- 2006
16. Metadata design for reconfigurable protocol stacks in systems beyond 3G
- Author
- Gazis, Vangelis, Alonistioti, Nancy, and Merakos, Lazaros
- Subjects
- Third generation wireless technology -- Research, Data structures -- Research, Engineering and manufacturing industries
- Published
- 2006
17. Memory retrieval tasks determine the serial position curves of linear orders with categorical structures
- Author
- Jou, Jerwen
- Subjects
- Storage capacity, Memory compaction -- Analysis, Memory compaction -- Research, Memory management -- Analysis, Memory management -- Research, Memory mapping -- Analysis, Memory mapping -- Research, Memory partitioning -- Analysis, Memory partitioning -- Research, Memory protection -- Analysis, Memory protection -- Research, Memory refresh (Computers) -- Analysis, Memory refresh (Computers) -- Research, Data structures -- Analysis, Data structures -- Research
- Abstract
I examined 2 different views on the organization of linear order information in memory: the linear schema view and the hierarchical structure view. The linear schema view holds that there is a strong tendency in memory to organize an array of transitively related elements into a unidimensional order. The hierarchical structure view maintains that the transitively related elements are organized in memory in a hierarchy of items. I proposed an input structure and retrieval compatibility hypothesis as a coherent explanation for the contradictory views on the memory organization of serial elements. I argue that the organization of linear order information in memory is determined by the nature of the memory retrieval task used. For example, a comparative judgment task is more compatible with a unidimensional structure, whereas a sequential recall or serial position identification task is more compatible with a hierarchical organization. The input structure and retrieval compatibility concept can also explain why dichotomous categories imposed on a linear order play very little role in comparative judgments.
- Published
- 2005
18. MICA-Graph: a tool for managing text and sketches during design processes
- Author
- Gardoni, M., Blanco, E., and Ruger, S.
- Subjects
- Concurrent engineering -- Research, Data structures -- Research, Graphic methods -- Usage, Graphic methods -- Analysis, Work groups -- Computer programs, Work groups -- Design and construction, Work groups -- Research, Workgroup software, Engineering and manufacturing industries
- Abstract
Keywords: Concurrent Engineering; non-structured information; information structuring; groupware; MICA-Graph tool; knowledge; know-how; sketch design retrieval
Concurrent Engineering (CE) approaches depend heavily on efficient and reliable communication between the people involved in the design, engineering, industrialisation and manufacturing of products. To address this need, we propose an approach to facilitate joint solutions to product design problems within a CE integrated team by supporting textual and graphical information in the form of sketches. We introduce a new and ergonomic groupware tool working on a network of PCs: MICA-Graph. This involves the exchange of technical messages (textual and graphical) and capitalisation of relevant knowledge. Our tool supports the description of technical problems, message exchange among users, monitoring of the problem solving process and the documentation of technical solutions. It also provides facilities to structure and archive knowledge and know-how learned from new methods of design sketch retrieval. The tool has been developed and evaluated, primarily to feed back manufacturing knowledge to design teams in the context of product change requests.
Author Affiliation: (1) Laboratoire GILCO, INPG-ENSGI, 46, avenue Felix Viallet, F-38031 Grenoble cedex 1, France; (2) Department of Computing, Imperial College London, South Kensington Campus, London SW7 2AZ, UK
- Published
- 2005
19. A fast approximate string searching algorithm
- Author
- Mhashi, Mahmoud, Rawashdeh, Adnan, and Hammouri, Awni
- Subjects
- Algorithms -- Research, Data structures -- Research, Algorithm, Computers
- Abstract
In both approximate and exact string searching algorithms, the shift distance at the skipping step plays a major role in the performance of string matching algorithms. A new algorithm [...]
- Published
- 2005
20. Informational asymmetries and observational learning in search
- Author
- Einav, Liran
- Subjects
- Data structures -- Research, Learning -- Research, Business, general
- Published
- 2005
21. A framework for analyzing levels of analysis issues in studies of e-collaboration
- Author
- Gallivan, Michael J. and Benbunan-Fich, Raquel
- Subjects
- Information storage and retrieval systems -- Models, Data structures -- Research, Decision support systems -- Models, Decision support software, Business, High technology industry, Literature/writing
- Abstract
The range of multilevel issues that can be encountered in research on the use of synchronous or asynchronous group support systems is investigated. Concepts of levels of analysis from the management literature are introduced and all empirical studies of e-collaboration from seven information systems journals for the period 1999-2003 are examined.
- Published
- 2005
22. Data structures and ejection chains for solving large-scale traveling salesman problems
- Author
- Gamboa, Dorabela, Rego, Cesar, and Glover, Fred
- Subjects
- Branch and bound algorithms -- Methods, Traveling-salesman problem -- Analysis, Data structures -- Research, Data structures -- Models, Business, Business, general, Business, international
- Abstract
A study develops a new data structure for solving the traveling salesman problem by improving the efficiency of the stem-and-cycle algorithm. The results indicate an improvement over the two-level tree data structure.
- Published
- 2005
23. Efficient minimization and manipulation of linearly transformed binary decision diagrams
- Author
- Gunther, Wolfgang and Drechsler, Rolf
- Subjects
- Decision tables -- Research, Decision logic tables -- Research, Data structures -- Research
- Published
- 2003
24. Efficient approximation of correlated sums on data streams
- Author
- Ananthakrishna, Rohit, Das, Abhinandan, Gehrke, Johannes, Korn, Flip, Muthukrishnan, S., and Srivastava, Divesh
- Subjects
- Data structures -- Research, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
In many applications such as IP network management, data arrives in streams and queries over those streams need to be processed online using limited storage. Correlated-sum (CS) aggregates are a natural class of queries formed by composing basic aggregates on (x, y) pairs and are of the form SUM {g(y) : x ≤ f(AGG(x))}, where AGG(x) can be any basic aggregate and f(), g() are user-specified functions. CS-aggregates cannot be computed exactly in one pass through a data stream using limited storage; hence, we study the problem of computing approximate CS-aggregates. We guarantee a priori error bounds when AGG(x) can be computed in limited space (e.g., MIN, MAX, AVG), using two variants of Greenwald and Khanna's summary structure for the approximate computation of quantiles. Using real data sets, we experimentally demonstrate that an adaptation of the quantile summary structure uses much less space, and is significantly faster, than a more direct use of the quantile summary structure, for the same a posteriori error bounds. Finally, we prove that, when AGG(x) is a quantile (which cannot be computed over a data stream in limited space), the error of a CS-aggregate can be arbitrarily large. Index Terms--Correlated aggregates, data streams, approximation, summary structures, a priori error bounds, IP network management.
- Published
- 2003
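The CS-aggregate form quoted in the abstract above can be illustrated with a naive offline evaluation. This is only a sketch of the query class, not the paper's streaming approximation; the function name `cs_aggregate` and its parameters are illustrative, not from the paper, and the sketch assumes all (x, y) pairs fit in memory:

```python
def cs_aggregate(pairs, agg, f, g):
    """Naive offline evaluation of a correlated-sum aggregate,
    SUM{ g(y) : x <= f(AGG(x)) }, over in-memory (x, y) pairs.

    agg: basic aggregate over the x values (e.g. min, max)
    f, g: user-specified functions, as in the abstract above
    """
    xs = [x for x, _ in pairs]
    threshold = f(agg(xs))                      # f applied to AGG over all x
    return sum(g(y) for x, y in pairs if x <= threshold)

# Example: sum the y of pairs whose x is within 4 of the minimum x.
# AGG = min -> 1, f(1) = 5, so pairs (1, 10) and (5, 20) qualify.
total = cs_aggregate([(1, 10), (5, 20), (9, 30)], min, lambda v: v + 4, lambda y: y)
```

The point of the paper is that this two-pass, in-memory evaluation is impossible over an unbounded stream with limited storage, which motivates the approximate quantile-summary approach.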
25. Comparing data streams using Hamming norms (how to zero in)
- Author
- Cormode, Graham, Datar, Mayur, Indyk, Piotr, and Muthukrishnan, S.
- Subjects
- Data structures -- Research, Query processing, Business, Computers, Electronics, Electronics and electrical industries
- Abstract
Massive data streams are now fundamental to many data processing applications. For example, Internet routers produce large scale diagnostic data streams. Such streams are rarely stored in traditional databases and instead must be processed 'on the fly' as they are produced. Similarly, sensor networks produce multiple data streams of observations from their sensors. There is growing focus on manipulating data streams and, hence, there is a need to identify basic operations of interest in managing data streams, and to support them efficiently. We propose computation of the Hamming norm as a basic operation of interest. The Hamming norm formalizes ideas that are used throughout data processing. When applied to a single stream, the Hamming norm gives the number of distinct items that are present in that data stream, which is a statistic of great interest in databases. When applied to a pair of streams, the Hamming norm gives an important measure of (dis)similarity: the number of unequal item counts in the two streams. Hamming norms have many uses in comparing data streams. We present a novel approximation technique for estimating the Hamming norm for massive data streams; this relies on what we call the 'l₀ sketch' and we prove its accuracy. We test our approximation method on a large quantity of synthetic and real stream data, and show that the estimation is accurate to within a few percentage points. Index Terms--Data stream analysis, approximate query processing, data structures and algorithms, data reduction.
- Published
- 2003
26. A simple algorithm for finding frequent elements in streams and bags
- Author
- Karp, Richard M., Shenker, Scott, and Papadimitriou, Christos H.
- Subjects
- Algorithm, Algorithms -- Research, Data structures -- Research
- Abstract
We present a simple, exact algorithm for identifying in a multiset the items with frequency more than a threshold θ. The algorithm requires two passes, linear time, and space 1/θ. The first pass is an on-line algorithm, generalizing a well-known algorithm for finding a majority element, for identifying a set of at most 1/θ items that includes, possibly among others, all items with frequency greater than θ. Categories and Subject Descriptors: C.2.2 [Computer-Communication Networks]: Network Protocols--database management General Terms: Algorithms, Measurement, Performance Additional Key Words and Phrases: Data stream, frequent elements
- Published
- 2003
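The two-pass scheme described in the abstract above can be sketched as follows. This is a minimal sketch under the stated bounds (at most 1/θ candidate counters in pass one, exact counting of survivors in pass two); the function name `frequent_items` and variable names are illustrative, and the input is assumed to be a re-iterable sequence, since the stream is traversed twice:

```python
from collections import Counter

def frequent_items(stream, theta):
    """Return the items whose frequency exceeds theta * len(stream).

    Pass 1 generalizes the majority-element algorithm: keep at most
    k = floor(1/theta) counters; when a new item arrives with all
    counters occupied, decrement every counter (discarding the new
    item), so each such event cancels k + 1 occurrences.  Any item
    with frequency > theta * n therefore survives as a candidate.
    Pass 2 counts the candidates exactly to discard false positives.
    """
    k = int(1 / theta)                 # space bound on candidate counters
    counters = {}
    for x in stream:                   # pass 1: candidate selection
        if x in counters:
            counters[x] += 1
        elif len(counters) < k:
            counters[x] = 1
        else:
            for y in list(counters):   # decrement all; drop zeroed counters
                counters[y] -= 1
                if counters[y] == 0:
                    del counters[y]
    n = len(stream)
    exact = Counter(x for x in stream if x in counters)  # pass 2: exact counts
    return {x for x, c in exact.items() if c > theta * n}
```

For example, `frequent_items(["a"] * 5 + ["b"] * 3 + list("cdefg"), 0.3)` keeps at most 3 counters yet exactly recovers the one item occurring in more than 30% of the 13 entries.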
27. Cost models for overlapping and multiversion structures
- Author
- Tao, Yufei, Papadias, Dimitris, and Zhang, Jun
- Subjects
- CD-ROM catalog, CD-ROM database, Database, Indexing -- Methods, Databases -- Research, Query processing -- Research, Spatial analysis (Statistics), Data structures -- Research, Database administration -- Research
- Abstract
Overlapping and multiversion techniques are two popular frameworks that transform an ephemeral index into a multiple logical-tree structure in order to support versioning databases. Although both frameworks have produced numerous efficient indexing methods, their performance analysis is rather limited; as a result there is no clear understanding about the behavior of the alternative structures and the choice of the best one, given the data and query characteristics. Furthermore, query optimization based on these methods is currently impossible. These are serious problems due to the incorporation of overlapping and multiversion techniques in several traditional (e.g., financial) and emerging (e.g., spatiotemporal) applications. In this article, we reduce performance analysis of overlapping and multiversion structures to that of the corresponding ephemeral structures, thus simplifying the problem significantly. This reduction leads to accurate cost models that predict the sizes of the trees, the node/page accesses, and selectivity of queries. Furthermore, the models offer significant insight into the behavior of the structures and provide guidelines about the selection of the most appropriate method in practice. Extensive experimentation proves that the proposed models yield errors below 5 and 15% for uniform and nonuniform data, respectively. Categories and Subject Descriptors: H.3.1 [Information Storage and Retrieval]: Content Analysis and Indexing--indexing methods General Terms: Theory Additional Key Words and Phrases: Database, temporal, spatiotemporal, index, overlapping and multiversion structures
- Published
- 2002
28. An analytical POC stack operations folding for continuous and discontinuous Java bytecodes
- Author
- Ton, Lee-Ren, Chang, Lung-Chung, and Chung, Chung-Ping
- Subjects
- Computer industry, Microcomputer industry, Java, Oracle America Inc. -- Research, Computer industry -- Research, Data structures -- Research, Java (Computer program language) -- Research, Virtual computer systems -- Research
- Published
- 2002
29. Parametric shape analysis via 3-valued logic
- Author
- Sagiv, Mooly, Reps, Thomas, and Wilhelm, Reinhard
- Subjects
- Software engineering -- Research, Programming languages -- Usage, Verification (Logic) -- Usage, Pointers (Computers) -- Usage, Form perception -- Research, Data structures -- Research, Computer storage devices -- Research
- Abstract
Shape analysis concerns the problem of determining "shape invariants" for programs that perform destructive updating on dynamically allocated storage. This article presents a parametric framework for shape analysis that can be instantiated in different ways to create different shape-analysis algorithms that provide varying degrees of efficiency and precision. A key innovation of the work is that the stores that can possibly arise during execution are represented (conservatively) using 3-valued logical structures. The framework is instantiated in different ways by varying the predicates used in the 3-valued logic. The class of programs to which a given instantiation of the framework can be applied is not limited a priori (i.e., as in some work on shape analysis, to programs that manipulate only lists, trees, DAGS, etc.); each instantiation of the framework can be applied to any program, but may produce imprecise results (albeit conservative ones) due to the set of predicates employed. Categories and Subject Descriptors: D.2.5 [Software Engineering]: Testing and Debugging--symbolic execution; D.3.3 [Programming Languages]: Language Constructs and Features--data types and structures; dynamic storage management; E.1 [Data]: Data Structures--graphs; lists; trees; E.2 [Data]: Data Storage Representations--composite structures; linked representations; F.3.1 [Logics and Meanings of Programs]: Specifying and Verifying and Reasoning about Programs--assertions; invariants General Terms: Algorithms, Languages, Theory, Verification Additional Key Words and Phrases: Abstract interpretation, alias analysis, constraint solving, destructive updating, pointer analysis, shape analysis, static analysis, 3-valued logic
- Published
- 2002
30. Burst tries: a fast, efficient data structure for string keys
- Author
- Heinz, Steffen, Zobel, Justin, and Williams, Hugh E.
- Subjects
- Database administration -- Research, Information storage and retrieval systems -- Research, Binary trees (Computers) -- Research, Data structures -- Research
- Abstract
Many applications depend on efficient management of large sets of distinct strings in memory. For example, during index construction for text databases a record is held for each distinct word in the text, containing the word itself and information such as counters. We propose a new data structure, the burst trie, that has significant advantages over existing options for such applications: it uses about the same memory as a binary search tree; it is as fast as a trie; and, while not as fast as a hash table, a burst trie maintains the strings in sorted or near-sorted order. In this paper we describe burst tries and explore the parameters that govern their performance. We experimentally determine good choices of parameters, and compare burst tries to other structures used for the same task, with a variety of data sets. These experiments show that the burst trie is particularly effective for the skewed frequency distributions common in text collections, and dramatically outperforms all other data structures for the task of managing strings while maintaining sort order. Categories and Subject Descriptors: E.1 [Data]: Data Structures; H.3.1 [Information Storage and Retrieval]: Content Analysis and Indexing General Terms: Algorithms Additional Key Words and Phrases: Binary trees, splay trees, string data structures, text databases, tries, vocabulary accumulation
- Published
- 2002
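A minimal Python sketch of the bursting idea from the abstract above. The container capacity `limit` and the use of plain lists as containers are assumptions made here for illustration; the paper experiments with several container structures (lists, binary search trees, splay trees) and bursting heuristics.

```python
class Node:
    def __init__(self):
        self.children = {}   # char -> Node (the trie part)
        self.container = []  # remaining suffixes held in a leaf container
        self.is_word = False # a word ends exactly at this node

class BurstTrie:
    def __init__(self, limit=4):
        self.limit = limit   # container capacity before bursting
        self.root = Node()

    def insert(self, word):
        node, i = self.root, 0
        # follow the existing trie structure as far as it goes
        while i < len(word) and word[i] in node.children:
            node = node.children[word[i]]
            i += 1
        if i == len(word):
            node.is_word = True
            return
        suffix = word[i:]
        if suffix not in node.container:
            node.container.append(suffix)
        if len(node.container) > self.limit:
            self._burst(node)

    def _burst(self, node):
        # redistribute container suffixes into trie children by first char
        for suffix in node.container:
            child = node.children.setdefault(suffix[0], Node())
            rest = suffix[1:]
            if rest:
                child.container.append(rest)
            else:
                child.is_word = True
        node.container = []
        # a child may itself now be over capacity
        for child in node.children.values():
            if len(child.container) > self.limit:
                self._burst(child)

    def __contains__(self, word):
        node, i = self.root, 0
        while i < len(word) and word[i] in node.children:
            node = node.children[word[i]]
            i += 1
        if i == len(word):
            return node.is_word
        return word[i:] in node.container
```

Keeping the containers sorted (rather than the plain lists used here) is what gives the structure its near-sorted traversal order.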
31. Finding localized associations in market basket data
- Author
-
Aggarwal, Charu C., Procopiuc, Cecilia, and Yu, Philip S.
- Subjects
Data capture devices -- Research ,Data structures -- Research ,Business ,Computers ,Electronics ,Electronics and electrical industries - Abstract
In this paper, we discuss a technique for discovering localized associations in segments of the data using clustering. Often, the aggregate behavior of a data set is very different from that of its localized segments. In such cases, it is desirable to design algorithms that are effective in discovering localized associations, because they expose a customer pattern more specific than the aggregate behavior. This information may be very useful for target marketing. We present empirical results which show that the method is indeed able to find a significantly larger number of associations than can be discovered by analysis of the aggregate data. Index Terms--Association rules, clustering, market basket data.
- Published
- 2002
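The two-stage idea in the abstract above, cluster first, then mine associations inside each cluster, can be sketched as follows. The clusters are assumed to be given already (the paper obtains them with a clustering algorithm), and pairwise support stands in for general association rules.

```python
from itertools import combinations
from collections import Counter

def pair_support(baskets):
    """Fraction of baskets containing each item pair."""
    counts = Counter()
    for b in baskets:
        for pair in combinations(sorted(set(b)), 2):
            counts[pair] += 1
    n = len(baskets)
    return {p: c / n for p, c in counts.items()}

def localized_pairs(clusters, min_support):
    """Pairs meeting min_support inside at least one cluster; with a low
    enough global support, such pairs are invisible to aggregate analysis."""
    found = {}
    for cid, baskets in clusters.items():
        for pair, s in pair_support(baskets).items():
            if s >= min_support:
                found.setdefault(pair, []).append(cid)
    return found
```

For example, a pair bought together by most customers in one segment can clear the support threshold inside that cluster while its support over all baskets stays well below it.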
32. Comparing software prediction techniques using simulation
- Author
-
Shepperd, Martin and Kadoda, Gada
- Subjects
Software engineering -- Research ,Data structures -- Research ,Machine learning -- Research ,Computer software industry -- Research - Published
- 2001
33. Fully dynamic maintenance of k-connectivity in parallel
- Author
-
Liang, Weifa, Brent, Richard P., and Shen, Hong
- Subjects
Algorithms -- Research ,Parallel processing -- Research ,Data structures -- Research ,Graph theory -- Research ,Business ,Computers ,Electronics ,Electronics and electrical industries - Abstract
The design of fully dynamic NC algorithms for maintaining the k-connectivity of graphs is examined. For the problems considered, there is a gap between the best known sequential time complexity and the work required by the parallel algorithms.
- Published
- 2001
34. Data relation vectors: a new abstraction for data optimizations
- Author
-
Kandemir, Mahmut and Ramanujam, J.
- Subjects
Data structures -- Research ,Compiling (Electronic computers) -- Research ,Memory management -- Research - Published
- 2001
35. Bit-parallel systolic multipliers for GF(2m) fields defined by all-one and equally spaced polynomials
- Author
-
Lee, Chiou-Yng, Lu, Erl-Huei, and Lee, Jau-Yien
- Subjects
Algorithms -- Research ,Computers -- Research ,Data structures -- Research ,Modular arithmetic -- Research ,Parallel programming (Computer science) -- Research ,Very-large-scale integration -- Research - Published
- 2001
36. A fast algorithm for multiplicative inversion in GF(2m) using normal basis
- Author
-
Takagi, Naofumi, Yoshiki, Jun-ichi, and Takagi, Kazuyoshi
- Subjects
Algorithms -- Research ,Computers -- Research ,Data structures -- Research ,Fermat's theorem -- Usage ,Modular arithmetic -- Research - Published
- 2001
37. An efficient data structure for dynamic memory management
- Author
-
Morris Chang, J. and Daugherty, Charles H.
- Subjects
Data structures -- Research ,Memory (Computers) -- Research - Published
- 2000
38. Multisensor fusion for atrial and ventricular activity detection in coronary care monitoring
- Author
-
Hernandez, Alfredo I., Carrault, Guy, Mora, Fernando, Thoraval, Laurent, Passariello, Gianfranco, and Schleich, Jean M.
- Subjects
Biomedical engineering ,Patient monitoring ,Data structures -- Research ,Critical care medicine ,Artificial intelligence -- Research ,Biosensors -- Technology application ,Electrocardiogram ,Biological sciences ,Business ,Computers ,Health care industry - Published
- 1999
39. Improving clinical decision support through case-based data fusion
- Author
-
Azuaje, Francisco, Dubitzky, Werner, Black, Norman, and Adamson, Kenny
- Subjects
Biomedical engineering ,Coronary heart disease -- Risk factors ,Databases -- Usage ,Artificial intelligence -- Research ,Electrocardiogram ,Neural networks -- Usage ,Data structures -- Research ,Biological sciences ,Business ,Computers ,Health care industry - Published
- 1999
40. Fusion of angiography and intravascular ultrasound in vivo: establishing the absolute 3-D frame orientation
- Author
-
Wahle, Andreas, Prause, Guido P. M., Birgelen, Clemens von, Erbel, Raimund, and Sonka, Milan
- Subjects
Biomedical engineering ,Angiography ,Intravascular ultrasonography -- Technology application ,Three-dimensional graphics -- Research ,Intravenous catheterization -- Technology application ,Blood vessels ,Artificial intelligence -- Research ,Data structures -- Research ,Biological sciences ,Business ,Computers ,Health care industry - Published
- 1999
41. Optimal bit allocation and best-basis selection for wavelet packets and TSVQ
- Author
-
Goldschneider, Jill R. and Riskin, Eve A.
- Subjects
Data compression -- Research ,Data structures -- Research ,Tree structures (Computers) -- Research ,Wave packets -- Research ,Business ,Computers ,Electronics ,Electronics and electrical industries - Abstract
Four-dimensional pruned tree-structured vector quantization, together with an efficient algorithm, was used to identify the optimal wavelet packet tree (WPT) bit allocation and best-basis selection on the lower hull of the rate-distortion curve. The performance of the proposed quantization method, trained on five USC database images (crowd, couple, man, woman 1, and woman 2), was observed to improve with increasing WPT depth, since more levels of decomposition permit more adaptability in signal representation and compression.
- Published
- 1999
42. Automatic compiler-inserted prefetching for pointer-based applications
- Author
-
Luk, Chi-Keung and Mowry, Todd C.
- Subjects
Compilers -- Research ,Cache memory -- Research ,Data structures -- Research ,Multiprocessors -- Research ,Compiling (Electronic computers) -- Research - Published
- 1999
43. Cat: A functional stack-based language
- Author
-
Diggins, Christopher
- Subjects
Programming language ,Programming languages -- Properties ,Data structures -- Research - Published
- 2008
44. A general framework for adaptive processing of data structures
- Author
-
Frasconi, Paolo, Gori, Marco, and Sperduti, Alessandro
- Subjects
Data structures -- Research ,Electronic data processing -- Research ,Machine learning -- Methods ,Pattern recognition -- Methods ,Neural networks -- Usage ,Business ,Computers ,Electronics ,Electronics and electrical industries - Abstract
A structured organization of information is typically required by symbolic processing. On the other hand, most connectionist models assume that data are organized according to relatively poor structures, like arrays or sequences. The framework described in this paper is an attempt to unify adaptive models like artificial neural nets and belief nets for the problem of processing structured information. In particular, relations between data variables are expressed by directed acyclic graphs, where both numerical and categorical values coexist. The general framework proposed in this paper can be regarded as an extension of both recurrent neural networks and hidden Markov models to the case of acyclic graphs. In particular, we study the supervised learning problem as the problem of learning transductions from an input structured space to an output structured space, where transductions are assumed to admit a recursive hidden state-space representation. We introduce a graphical formalism for representing this class of adaptive transductions by means of recursive networks, i.e., cyclic graphs where nodes are labeled by variables and edges are labeled by generalized delay elements. This representation makes it possible to incorporate the symbolic and subsymbolic nature of data. Structures are processed by unfolding the recursive network into an acyclic graph called the encoding network. In so doing, inference and learning algorithms can be easily inherited from the corresponding algorithms for artificial neural networks or probabilistic graphical models. Index Terms - Graphical models, graphs, learning data structures, problem-solving, recurrent neural networks, recursive neural networks, sequences, syntactic pattern recognition.
- Published
- 1998
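The unfolding of a recursive network into an "encoding network" described above amounts to a memoized bottom-up pass over the input DAG: each node's state is computed from its label and its children's states. In this hypothetical sketch, a plain Python function `f` stands in for the learned neural or probabilistic transition model.

```python
def process_dag(nodes, children, labels, f):
    """Compute a state for every node of a DAG bottom-up: the state of
    a node is f(label, child_states). Memoization ensures each node is
    processed once, mirroring the shared structure of the DAG."""
    states = {}
    def state(n):
        if n not in states:
            states[n] = f(labels[n], [state(c) for c in children.get(n, [])])
        return states[n]
    for n in nodes:
        state(n)
    return states
```

With `f = lambda lab, kids: lab + sum(kids)` on a diamond-shaped DAG, the shared sink node is evaluated once and its state reused by both parents, exactly the sharing the encoding network exploits.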
45. Object oriented storage of material data for coupled problems
- Author
-
Driesen, Johan, Fransen, Johan, Gersem, Herbert De, Belmans, Ronnie, and Hameyer, Kay
- Subjects
Magnetic fields -- Research ,Finite element method -- Usage ,Data structures -- Research ,Object-oriented programming -- Usage ,Eddy currents (Electric) -- Research ,Business ,Electronics ,Electronics and electrical industries - Abstract
The calculation of coupled field problems, in order to obtain realistic numerical models, is becoming more and more important. Coupling strategies require a flexible data structure to access the required material data. An object-oriented data structure offers several features that prove useful for storing these material characteristics, including different tensor representations of a characteristic and different mathematical descriptions such as tables and formulae. Hence, an easy and robust environment for the CAD programmer becomes available. The use of these data structures is demonstrated by two typical coupled computations, one magneto-thermal and one electro-thermal. Index terms - Finite element methods, object oriented programming, motor drives, losses, eddy currents, dielectric losses
- Published
- 1998
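A hedged sketch of what such an object-oriented material store might look like: a common interface hides whether a characteristic is backed by a closed-form formula or a measured table. The class names and the piecewise-linear table interpolation are illustrative assumptions, not the authors' implementation.

```python
from abc import ABC, abstractmethod
from bisect import bisect_left

class MaterialCharacteristic(ABC):
    """Common interface: a solver asks for a property value and never
    sees whether it comes from a formula or from measured data."""
    @abstractmethod
    def value(self, arg): ...

class FormulaCharacteristic(MaterialCharacteristic):
    """Characteristic given as a closed-form expression."""
    def __init__(self, func):
        self.func = func
    def value(self, arg):
        return self.func(arg)

class TableCharacteristic(MaterialCharacteristic):
    """Characteristic given as measured (x, y) pairs, interpolated
    piecewise-linearly and clamped outside the measured range."""
    def __init__(self, points):
        self.xs, self.ys = zip(*sorted(points))
    def value(self, arg):
        i = bisect_left(self.xs, arg)
        if i == 0:
            return self.ys[0]
        if i == len(self.xs):
            return self.ys[-1]
        x0, x1 = self.xs[i - 1], self.xs[i]
        y0, y1 = self.ys[i - 1], self.ys[i]
        return y0 + (y1 - y0) * (arg - x0) / (x1 - x0)
```

A coupled solver can then hold, say, a temperature-dependent conductivity without caring which concrete subclass supplies it.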
46. An object oriented data structure for field analysis
- Author
-
Popescu, Mihai, Munteanu, Irina, Constantin, Cristian-George, and Ioan, Daniel
- Subjects
Electromagnetic fields -- Analysis ,Data structures -- Research ,Object-oriented programming -- Usage ,Business ,Electronics ,Electronics and electrical industries - Abstract
A structure of classes suitable for a general electromagnetic field analysis program is presented. The class structure is highly decoupled so that it can be implemented in a distributed environment. The data structure was tested on a simple case, with implementations in both the C++ and Java languages. Index terms - Electromagnetic analysis, Finite element methods, Object oriented programming, Object oriented methods
- Published
- 1998
47. Designing access methods for bitemporal databases
- Author
-
Kumar, Anil, Tsotras, Vassilis J., and Faloutsos, Christos
- Subjects
Databases -- Research ,Data structures -- Research ,Access control (Computers) -- Research ,Business ,Computers ,Electronics ,Electronics and electrical industries - Abstract
By supporting the valid and transaction time dimensions, bitemporal databases represent reality more accurately than conventional databases. In this paper, we examine the issues involved in designing efficient access methods for bitemporal databases, and propose the partial-persistence and the double-tree methodologies. The partial-persistence methodology reduces bitemporal queries to partial persistence problems for which an efficient access method is then designed. The double-tree methodology 'sees' each bitemporal data object as consisting of two intervals (a valid-time and a transaction-time interval) and divides objects into two categories according to whether the right endpoint of the transaction time interval is already known. A common characteristic of both methodologies is that they take into account the properties of each time dimension. Their performance is compared with a straightforward approach that 'sees' the intervals associated with a bitemporal object as composing one rectangle, which is stored in a single multidimensional access method. Given that some limited additional space is available, our experimental results show that the partial-persistence methodology provides the best overall performance, especially for transaction timeslice queries. For those applications that require ready, off-the-shelf access methods, the double-tree methodology is a good alternative. Index Terms - Bitemporal databases, access methods, transaction-time, valid-time, data structures.
- Published
- 1998
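The double-tree methodology's central split can be sketched in a few lines: bitemporal objects whose transaction interval is still open (right endpoint not yet known, "now") are kept apart from closed ones, and a query consults both sets. Linear scans stand in here for the two tree-based access methods, and all names are illustrative.

```python
NOW = float("inf")  # transaction-interval endpoint not yet known

class BitemporalObject:
    def __init__(self, key, vstart, vend, tstart, tend=NOW):
        self.key = key
        self.valid = (vstart, vend)  # valid-time interval
        self.trans = (tstart, tend)  # transaction-time interval

def split(objects):
    """Double-tree style split: objects with an open transaction
    interval go in one structure, closed ones in the other."""
    open_set = [o for o in objects if o.trans[1] == NOW]
    closed_set = [o for o in objects if o.trans[1] != NOW]
    return open_set, closed_set

def bitemporal_query(objects, t, v):
    """Keys of all objects alive at transaction time t and valid time v
    (half-open intervals; a scan stands in for the interval trees)."""
    open_set, closed_set = split(objects)
    hit = lambda o: (o.trans[0] <= t < o.trans[1]
                     and o.valid[0] <= v < o.valid[1])
    return [o.key for o in open_set + closed_set if hit(o)]
```

The point of the split is that the open set only ever needs its valid-time interval indexed, since every open transaction interval extends to "now".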
48. Delay abstraction in combinational logic circuits
- Author
-
Kobayashi, Noriya and Malik, Sharad
- Subjects
Block designs -- Observations ,Integrated circuits -- Design and construction ,Data structures -- Research - Published
- 1997
49. Authorization and revocation in object-oriented databases
- Author
-
Majetic, Ivo and Leiss, Ernst L.
- Subjects
Object-oriented databases -- Research ,Data security -- Research ,Access control (Computers) -- Research ,Data structures -- Research ,Business ,Computers ,Electronics ,Electronics and electrical industries - Abstract
Few studies of object-oriented databases deal with their security, a fundamental aspect of systems with complex data structures. Most authorization systems give users who own resources only some basic control over them; here, we give users more direct control over their resources by associating propagation numbers with each grant. Propagation numbers govern the grantability and exercisability of the privileges. Of particular interest in our study of authorization in an o-o environment is the combination of inheritance and the granting of privileges. Diverse policies are discussed and implemented in a test-bed system. Index Terms - Object-oriented systems, inheritance, granting, revoking, bounded propagation of privileges.
- Published
- 1997
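A toy model of the propagation-number mechanism described above: a grant carries a number n, a grantee may regrant only with a strictly smaller number, and n = 0 means use-only. Class and method names are hypothetical, and revocation here simply cascades through derived grants.

```python
class AuthSystem:
    """Grants carry a propagation number: n > 0 lets the grantee
    regrant with a strictly smaller number, n == 0 is use-only."""
    def __init__(self, owner, privilege):
        # the owner holds the privilege with unbounded propagation
        self.grants = {owner: float("inf")}
        self.granted_by = {}  # grantee -> grantor, for cascading revocation
        self.privilege = privilege

    def grant(self, grantor, grantee, prop):
        # the grantor's own propagation number must strictly dominate
        if self.grants.get(grantor, -1) <= prop:
            raise PermissionError("propagation number must decrease")
        self.grants[grantee] = prop
        self.granted_by[grantee] = grantor

    def can_exercise(self, user):
        return user in self.grants

    def revoke(self, user):
        """Revoke user's grant and, recursively, all grants derived from it."""
        self.grants.pop(user, None)
        for grantee, grantor in list(self.granted_by.items()):
            if grantor == user:
                self.revoke(grantee)
        self.granted_by.pop(user, None)
```

Because each regrant strictly decreases the number, every grant chain is bounded in length, which is exactly the "bounded propagation of privileges" the index terms refer to.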
50. An empirical study of domain knowledge and its benefits to substructure discovery
- Author
-
Djoko, Surnjani, Cook, Diane J., and Holder, Lawrence B.
- Subjects
Data compression -- Research ,Data structures -- Research ,Computer programming -- Research ,Business ,Computers ,Electronics ,Electronics and electrical industries - Abstract
Discovering repetitive, interesting, and functional substructures in a structural database improves the ability to interpret and compress the data. However, scientists working with a database in their area of expertise often search for predetermined types of structures or for structures exhibiting characteristics specific to the domain. This paper presents a method for guiding the discovery process with domain-specific knowledge. In this paper, the SUBDUE discovery system is used to evaluate the benefits of using domain knowledge to guide the discovery process. Domain knowledge is incorporated into SUBDUE following a single general methodology to guide the discovery process. Results show that domain-specific knowledge improves the search for substructures that are useful to the domain and leads to greater compression of the data. To illustrate these benefits, examples and experiments from the computer programming, computer-aided design circuit, and artificially generated domains are presented. Index Terms - Data mining, minimum description length principle, data compression, inexact graph match, domain knowledge.
- Published
- 1997
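A crude sketch of guiding substructure discovery with domain knowledge. Description length is measured here as a bare edge count and candidate substructures are single edge labels, a severe simplification of SUBDUE's inexact graph matching and true MDL measure; the hypothetical `allowed` parameter plays the role of domain-specific knowledge restricting the candidates considered.

```python
from collections import Counter

def best_substructure(edges, allowed=None):
    """Rank one-edge substructures by compression value
    DL(G) / (DL(S) + DL(G|S)), with description lengths crudely
    measured as edge counts. `allowed`, if given, restricts the
    search to substructures the domain deems interesting."""
    dl_g = len(edges)  # DL of the uncompressed graph
    best, best_value = None, 0.0
    for label, occ in Counter(lab for _, _, lab in edges).items():
        if allowed is not None and label not in allowed:
            continue
        # DL(S) = 1 for the pattern definition; DL(G|S) = remaining edges
        value = dl_g / (1 + (dl_g - occ))
        if value > best_value:
            best, best_value = label, value
    return best, best_value
```

Restricting `allowed` can steer the search away from the globally most compressive pattern toward one that is meaningful in the domain, which is the trade-off the paper quantifies.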