1,738 results
Search Results
2. The JDL model of data fusion applied to cyber-defence — A review paper.
- Author
-
Schreiber-Ehle, S. and Koch, W.
- Abstract
In the ever growing literature on countering the cyber threat, the so-called JDL model of data fusion, well established in the information fusion community, has been applied to characterize the inner structure of problems within cyber defence and their mutual relationship. The overarching goal is to provide contributions to comprehensive cyber situational awareness by producing timely situation pictures. Cyber situational awareness, however, is prerequisite to taking appropriate actions, i.e. for “defence”. In this review paper, we provide an overview of what has been proposed in this context by various authors and collect basic insights published in the open literature. By doing so, we wish to provide an overview of the current discussion which reflects our own apprehension and prioritization. Moreover, we stress our opinion where relevant research questions are to be expected. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
3. Extracting Information from Scientific Papers in the Cloud.
- Author
-
Škoda, Petr, Šperka, Svatopluk, and Smrž, Pavel
- Abstract
This paper deals with a system for extracting information from scientific papers. We analyze the drawbacks of an existing implementation running on the N1 Grid Engine. Reasons for moving extraction to the Cloud are presented next. The architecture of the Cloud port is discussed, and the links to the API and the platform developed within the mOSAIC project are elaborated. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
4. Using paper prototyping to assess the perceived acceptance of MedMate: A home-based pill dispenser.
- Author
-
Chua, Jit Chee, Foo, Min-Hui, Cheong, Yian Ling, Ng, Jamie, and Toh, Chen Koon
- Abstract
Home-based pill dispensers are one of many devices designed to aid medication management and improve adherence. Such devices are common in the United States of America but not widely exploited in Asia. We designed a home-based pill dispenser known as MedMate. Users' perceived acceptance of a new technology such as MedMate affects its actual usage. Thus, we created a paper prototype of MedMate to get feedback from users early in the development cycle. We conducted a user study consisting of a questionnaire adapted from the Technology Acceptance Model (TAM), followed by an interview to gain a more in-depth understanding of the users' perceived acceptance and, more importantly, their concerns regarding MedMate. The results of the user study show that the participants were positive about its usefulness and ease of use. More than two-thirds of the participants were willing to accept and use the pill dispenser in their homes, but with some concerns. Their main concerns include affordability, portability, reliability, safety and security, privacy, maintenance, durability, efficiency of reminders, accessibility and capability in handling different forms of medication. These findings resonate with our prior studies on the acceptance of other home-based medical devices such as mobile ECG-measuring devices. Thus, most of these factors can be extended for consideration when designing other home-based medical devices. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
5. Self-efficacy and its impacts on academic researchers' absorptive capacity in smart phones acceptance: Conceptual paper.
- Author
-
Bamasoud, Doaa M., Iahad, Noorminshah A., and Rahman, Azizah Abdul
- Abstract
Information and Communications Technologies (ICT) play a critical role in expanding the boundaries of academic research activities, and academic researchers use ICT mostly to enhance their research. Owing to their increasingly powerful capabilities, smart phones have become one of the most widespread technologies worldwide, and many studies have focused on their acceptance. However, the acceptance of smart phones by academic researchers remains understudied, despite the importance and capabilities of such technologies for researchers. On account of the role that an individual's IT knowledge plays in technology acceptance, and the impact of willingness to learn upon it, the paper integrates knowledge, the capability of using that knowledge, and the willingness-to-learn construct into the proposed model. This conceptual paper thus investigates the acceptance of smart phones by academic researchers using the Technology Acceptance Model, integrating the absorptive capacity construct (representing IT knowledge and the capability of using it) and the self-efficacy construct (representing willingness to learn) into the model. [ABSTRACT FROM PUBLISHER]
- Published
- 2013
- Full Text
- View/download PDF
6. Extraction of Answer Image of Choice Questions in Examination Paper.
- Author
-
Zhipeng, Xu
- Abstract
Firstly, the skewed image of the examination paper is corrected by the Hough transform, and mathematical morphology methods are then used to locate the frame lines of the table. Connected component analysis is then utilized to locate the form region based on the maximum-area feature. Finally, the form frame lines are removed according to the vertical gradient, and the answer image is obtained by analyzing the coordinates of the objects. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
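The "connected component analysis ... based on maximum area feature" step that this abstract describes can be sketched in a few lines of Python. This is a hypothetical illustration of the general technique (breadth-first flood fill over a binary image), not the authors' implementation:

```python
from collections import deque

def largest_component(grid):
    """Return the size and cell set of the largest 4-connected
    region of foreground (1) pixels in a binary image."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    best = set()
    for sr in range(rows):
        for sc in range(cols):
            if grid[sr][sc] != 1 or (sr, sc) in seen:
                continue
            comp, queue = set(), deque([(sr, sc)])
            seen.add((sr, sc))
            while queue:
                r, c = queue.popleft()
                comp.add((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and grid[nr][nc] == 1 and (nr, nc) not in seen):
                        seen.add((nr, nc))
                        queue.append((nr, nc))
            if len(comp) > len(best):
                best = comp
    return len(best), best
```

In the paper's pipeline, the cell set of the maximum-area component would delimit the form region whose frame lines are subsequently removed.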
7. Storing XML documents and XML policies in relational databases.
- Author
-
El-Aziz, Abd El-Aziz Ahmed Abd and Kannan, A.
- Abstract
In this paper, we explore how to support security models for XML documents by using relational databases. Our model is based on the model in [6], but we use our own algorithm to store the XML documents in relational databases. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
8. Using web conference system during the consultation hours.
- Author
-
Kiss, Gabor
- Abstract
Students of the undergraduate course Introduction to Informatics become acquainted with computer architecture, operating systems, computer networks and data encryption, both historically and in up-to-date applications. During the consultation hours, the students have the opportunity to ask questions about the problematic parts of the learning material, which they can download after the presentations. Experience shows that students wait until the last day before the test to download and learn the material and do not use the opportunities of the consultation hours; usually only one or two students come to ask questions. BigBlueButton is an open-source web conferencing system, and students in the first semester could use it to ask their questions from home. Experience shows that this form of consultation is closer to students' habits and helps them understand the learning material better and score a higher paper mark. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
9. Work in progress: Mesoscopic analysis of engineering education scholarship in electrical and Computer Engineering, 2002–2011.
- Author
-
Fayyaz, Farrah and Jesiek, Brent K.
- Abstract
Engineering education remains a relatively new and rapidly developing research field. As a result, there are significant variations in the quantity and kinds of engineering education research being conducted and published in different engineering disciplines and local/national contexts. Responding to a lack of systematic attempts to study such dynamics, this paper describes our ongoing efforts to investigate the quantity and nature of engineering education scholarship in Electrical and Computer Engineering (ECE) and allied fields over the last ten years. More specifically, we report preliminary results of an in-depth quantitative and qualitative analysis of journal articles (n=664) published in IEEE Transactions on Education from 2002 to 2011, including co-authorship patterns, geographic distribution of authors, and research funding reported. To help contextualize the study, the authors also discuss other studies of publication trends in engineering education research. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
10. An improved wavelet digital watermarking software implementation.
- Author
-
Khalifa, Othman O, binti Yusof, Yusnita, Abdalla, Aisha-Hassan, and Olanrewaju, R.F.
- Abstract
There are quite a number of studies proposing digital image watermarking using the Discrete Wavelet Transform (DWT); however, each is clearly distinctive in terms of its scope and applications. In this paper, the proposed system is explained thoroughly, including the general workflow, the proposed algorithms for both the original and the improved methods, named subband matching and selective subband matching respectively, and the various attacks performed for evaluation. In the improved method, the algorithms were modified so that only selected matching subbands are used in embedding and extracting the watermark. The methods were compared and tested against attacks using the standard benchmark Stirmark. Experimental results of the proposed methods' performance were analyzed using Peak Signal to Noise Ratio (PSNR) calculations and the Structural SIMilarity (SSIM) index for watermark imperceptibility and robustness, respectively. The improvement could be seen in the quality of the watermarked image (imperceptibility) and of the extracted watermark (robustness). [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
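The PSNR metric used for evaluation in this abstract has a standard closed form, 10 log10(MAX^2 / MSE). A minimal Python sketch (generic formula, not the authors' evaluation code):

```python
import math

def psnr(original, distorted, max_val=255.0):
    """Peak Signal-to-Noise Ratio between two equal-length pixel
    sequences; higher means the distorted image is closer to the
    original.  Identical images give infinity."""
    if len(original) != len(distorted):
        raise ValueError("images must have the same number of pixels")
    mse = sum((a - b) ** 2 for a, b in zip(original, distorted)) / len(original)
    if mse == 0:
        return math.inf
    return 10.0 * math.log10(max_val ** 2 / mse)
```

For example, a uniform error of 10 grey levels on an 8-bit image gives an MSE of 100 and a PSNR of about 28.13 dB.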
11. Learning from user experience in games.
- Author
-
Brandstetter, Matthias F. and Ahmadi, Samad
- Abstract
Several concepts from traditional research on Artificial Intelligence (AI) need to be trained before they can be used. For example, when applied to a computer game, an AI framework has to "learn" how the game should be played. However, such training may not be trivial due to the often complex game world environments. This paper presents a novel training approach for game AI frameworks where, instead of manually defining game-playing rules a priori at design time, training of the AI system takes place interactively while sample games are being played. Additionally, this paper provides an example of modeling the game world so that game objects, such as Non-Playing Characters (NPCs), can be trained interactively. We also give an outlook on our ongoing research on incorporating the presented interactive training approach into a real-world game, namely an autonomous controller for the arcade Ms. Pac-Man video game using Case-Based Planning. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
12. Strategies of teaching model in Visual FoxPro database system course.
- Author
-
Jia, Xiaojun, Liu, Jinping, Chen, Baoming, and Xu, Juding
- Abstract
The Visual FoxPro database system course is a required course for economics and management majors at our university. According to the needs of curriculum reform, and combining modern teaching methods with network technology, this paper proposes reform strategies for the course covering teacher and textbook configuration, teaching content, teaching methods, and assessment methods. We explicitly point out that the new teaching model shifts the emphasis from theory teaching to practical teaching, from classroom teaching to network-platform teaching, and from paper testing to paperless machine testing. Reform results show that the strategies improve the teaching effect, increase students' interest in learning, and enhance their ability to independently solve practical problems. The reform of the teaching model in this course has been warmly welcomed by students. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
13. Research on computer color recovery system for traditional Chinese painting.
- Author
-
Ding, Haiyan and Ding, Huaidong
- Abstract
Experts agree that replication of cultural relics has been an important way to protect them in all ages, and color recovery is the key problem for costly and rare paintings and calligraphy that are in severe danger. According to the characteristics of the pigments of traditional Chinese painting, this research used the Visual C++ IDE to construct a computer color-recovery adjustment module for traditional Chinese painting, based on chromatics theory and digital image processing technology. For any designated color region of a Chinese painting, the system can carry out fine adjustment of the color recovery. An automatic color-recovery model is also constructed, based on the fact that the colors frequently used in Chinese painting age naturally over time depending on the paper materials and preservation conditions, and an automatic color-recovery function for Chinese painting was realized. Furthermore, associated applications of the digital restoration system and the color management system are also discussed in this paper, and good color consistency in digital restoration is achieved. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
14. A multi-agent framework for dynamic task assignment and delegation in task distribution.
- Author
-
Khamees Itaiwi, Al-Mutazbellah, Ahmad, Mohd Sharifuddin, Abd Hamid, Nurzeatul Hamimah, Jaafar, Nur Huda, and Mahmoud, Moamin A
- Abstract
Agent-based systems are deployed in distributed data and control environments where resource usage is claimed to be more efficient. Research has shown that intelligent agents offer excellent performance in distributed environments, influenced by their autonomous or semi-autonomous actions with minimal human intervention. Assigned roles, these agents can be viewed as human-like societies capable of planning, coordinating, and executing tasks to achieve their goals. In a similar fashion, we model agents in our work on the peculiarities of human society to achieve optimum performance in a dynamic multi-skills environment. Due to time and task constraints, agents are assigned multiple roles and perform various other tasks. In this paper, we discuss a framework for dynamic task assignment and delegation in handling multi-task problems for a society of agents to achieve a common goal in an allocated time. To test the framework, we adopt and animate the Final Examination Paper Preparation (FEPP) process in a multi-agent system involving a number of faculty branches where multiple and dynamic task delegations are possible. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
15. Guidelines for procedures of a harmonised digital forensic process in network forensics.
- Author
-
Sibiya, George, Venter, H.S., Ngobeni, Sipho, and Fogwill, Thomas
- Abstract
Cloud computing is a new computing paradigm that presents fresh research issues in the field of digital forensics. Cloud computing builds upon virtualisation technologies and is distributed in nature. Depending on its implementation, the cloud can span numerous countries. Its distributed nature and virtualisation introduce digital forensic research issues that include, among others, difficulty in identifying and collecting forensically sound evidence. Even if the evidence may be identified and essential tools for collecting the evidence are acquired, it may be illegal to access computer data residing beyond the jurisdiction of a forensic investigator. The investigator needs to acquire a search warrant that can be executed in a specific foreign country, which may not be a single country due to the distributed nature of the cloud. Obtaining warrants for numerous countries at once may be costly and time consuming, and some countries may fail to comply with the demands of cloud forensics. Since the field of digital forensics is still in its infancy, it lacks standardised forensic processes and procedures. Thus, digital forensic investigators are able to collect evidence, but often fail in following a valid investigation process that is acceptable in a court of law. In addressing digital forensic issues such as the above, the authors are writing a series of papers aimed at providing guidelines for digital forensic procedures in a cloud environment. Live forensics and network forensics constitute an integral part of cloud forensics. A paper that deals with guidelines for digital forensic procedures in live forensics was submitted elsewhere. The current paper is therefore the second in the series, in which the authors propose and present guidelines for digital forensic procedures in network forensics. The authors eventually aim to present guidelines for digital forensic procedures in a cloud environment as the last paper in the series. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
16. Projective rectification of infrared image based on projective geometry.
- Author
-
Li, Xiaolu, He, Tao, Xu, Lijun, Chen, Lulu, and Guo, Zhanshe
- Abstract
Projective distortion caused by a tilted camera brings great inconvenience to subsequent image processing and pattern recognition. In this paper, a method based on projective geometry is introduced to correct the projective distortion of infrared images. The dual conic in the distorted image, which contains all the information needed for projective rectification, is identified first; the projective transformation matrix is then derived by decomposing the dual conic. Finally, optimization is carried out to obtain the optimal result. Experimental results show that the method can remove the projective distortion of infrared images effectively. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
17. Design and implementation of an online self-training system for the Computer System Platform course.
- Author
-
Li, Yujun, Zhu, Limiao, and Wang, Xiaoying
- Abstract
As a newly designed major course in the "Information Technology" direction, the "Computer System Platform" course involves comprehensive content including computer hardware platforms, software platforms, operating platforms and application platforms. It is thus difficult for students to learn, since it covers a wide range of knowledge. Hence, in this paper we design and implement an online learning and self-training system based on the J2EE architecture to address these issues. The system can import training questions into the database in batches, so teachers can easily import classified training questions in a variety of formats, reducing the burden of importing questions one by one. Using the system, students can review the knowledge learned in class, practice each chapter specifically, and consolidate it in time. Moreover, students can also take a simulated examination through the automatic exam-generation subsystem, which improves the effect of self-learning. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
18. Research on question bank maintenance for network open examination system.
- Author
-
Liu, Boqin and Zou, Xinchun
- Abstract
With the deepening reform of computer-based teaching, this paper, based on an open network examination system, addresses the fact that computer professionals with different basic skills have different question-bank maintenance needs. Different operating questions were added for different professions, and the principles and forms of the propositions were unified. Analysis of the test results shows that the hierarchical classification adopted in our school's teaching reform is a success. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
19. A method of constructing user interest model based on semantics.
- Author
-
Qinghong, Yang, Neng, Xiong, and Hao, Hu
- Abstract
In this paper, the user interest model is represented by a combination of the vector space model with an ontology. The algorithm introduces semantic similarity when calculating term weights, considers the semantic similarity of words, and then updates the term weights through a Bayesian network. The paper focuses on the steps of building and updating the user interest model, and verifies the validity of the model by experiment. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
20. A Study on Extraction Method of Synonyms in Specification Documents.
- Author
-
Kawai, Yasushi, Yoshikawa, Tomohiro, Furuhashi, Takeshi, Hirao, Eiji, Kuno, Ayako, and Gotoh, Tomohisa
- Abstract
Recently, the document information managed in companies has tended to become more complex and varied. Specification documents are used for the technical transfer and inheritance of manufactured products and services. However, the descriptions and meanings of the component words in specifications are often inconsistent or multiple, because a specification document is written by the persons in charge of its various parts, and readers may therefore misunderstand its contents. This paper focuses on synonyms (multiple descriptions of a word or a meaning) in specification documents and proposes a method of extracting them that considers the co-occurrence words of the component words. The paper applies the proposed method to test data, in which some words in an actual specification document are replaced with other words, and studies the effectiveness of the proposed method. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
21. ARP spoofing detection algorithm using ICMP protocol.
- Author
-
Jinhua, Gao and Kejian, Xia
- Abstract
Today, there are an increasing number of attack technologies, among which the ARP spoofing attack is considered one of the easiest yet most dangerous in local area networks. This paper first discusses the ARP spoofing attack and some related work. On this basis, the paper proposes an efficient algorithm based on the ICMP protocol to detect malicious hosts that are performing ARP spoofing attacks. The technique collects and analyzes ARP packets, and then injects ICMP echo request packets to probe for the malicious host according to its response packets. It does not disturb the activities of the hosts on the network, and it can also detect the real address mappings during an attack. [ABSTRACT FROM PUBLISHER]
- Published
- 2013
- Full Text
- View/download PDF
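The bookkeeping part of the detection scheme in this abstract (collecting ARP replies and flagging conflicting IP-to-MAC mappings) can be sketched in Python. This is a hypothetical illustration only; the actual packet capture and ICMP echo injection would require raw sockets or a library such as Scapy:

```python
def detect_arp_conflicts(arp_replies):
    """Scan a stream of (ip, mac) pairs taken from observed ARP
    replies and flag IPs claimed by more than one MAC, the classic
    symptom of ARP spoofing.  In the paper's scheme, each flagged
    IP would then be probed with an ICMP echo request to learn
    which MAC genuinely answers for it."""
    table = {}       # ip -> first MAC observed
    suspects = []    # (ip, old_mac, new_mac) conflicts
    for ip, mac in arp_replies:
        if ip in table and table[ip] != mac:
            suspects.append((ip, table[ip], mac))
        else:
            table.setdefault(ip, mac)
    return suspects
```

Because the detector only listens and (in the full scheme) sends ordinary echo requests, it does not disturb normal host traffic, matching the property claimed in the abstract.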
22. Association rule analysis using biogeography based optimization.
- Author
-
Bhugra, Divya, Goel, Samiksha, and Singhania, Vipul
- Abstract
In recent years, data mining has become a global research area for acquiring interesting relationships hidden in large data sets. Data mining has been used in various application domains such as market basket data, bioinformatics, medical diagnosis, web mining and scientific data analysis. In this paper, we optimize the rules generated by Association Rule Mining using Biogeography-Based Optimization (BBO). BBO has a way of sharing information between solutions through its migration mechanisms, and the motivation of this paper is to use this feature of BBO to find more accurate results. [ABSTRACT FROM PUBLISHER]
- Published
- 2013
- Full Text
- View/download PDF
23. Image Preprocessing Algorithm Analysis Based on Robertsoperator and Connected Region.
- Author
-
Chen, Long, Liu, Dan, Xi, Xiuhan, and Xie, Hui
- Abstract
To relieve urban congestion, various vehicle management technologies have been developed vigorously; parking management systems, for example, are relatively mature and use varied technologies. This paper focuses on edge detection based on static image preprocessing. Edge detection mostly makes use of the Roberts operator, and this paper improves the operator through in-depth analysis. In addition, this paper proposes a new image segmentation algorithm based on connected regions. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
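The standard (unimproved) Roberts cross operator mentioned in this abstract is a pair of 2x2 diagonal difference kernels. A minimal Python sketch of the textbook operator, not the paper's improved variant:

```python
import math

def roberts_edges(img):
    """Roberts cross gradient magnitude of a 2-D greyscale image
    (list of equal-length rows); output is (rows-1) x (cols-1)."""
    rows, cols = len(img), len(img[0])
    out = []
    for r in range(rows - 1):
        row = []
        for c in range(cols - 1):
            gx = img[r][c] - img[r + 1][c + 1]   # kernel [[1, 0], [0, -1]]
            gy = img[r][c + 1] - img[r + 1][c]   # kernel [[0, 1], [-1, 0]]
            row.append(math.hypot(gx, gy))
        out.append(row)
    return out
```

On a vertical step edge the response peaks at the step and is zero in the flat regions, which is what a subsequent segmentation stage would threshold.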
24. Research and Implementation of the High-Availability Spatial Database Based on Oracle.
- Author
-
Wu, Xiaochun, Wang, Kai, Su, Zixuan, and Liu, Yanjun
- Abstract
With geographic information system (GIS) technology developing and enterprise-level GIS applications gradually widening, it is necessary to build a high-availability spatial database, yet at present no commercial spatial database software realizes high availability. This paper proposes an architecture for a high-availability spatial database based on the popular object-relational database software Oracle and verifies its feasibility by experiment, providing a new idea for realizing a high-availability spatial database. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
25. A Pinyin Input Method Editor with English-Chinese Aided Translation Function.
- Author
-
Li, Dong
- Abstract
This paper develops an interactive translation tool aimed at professional translators: a Pinyin input method editor with an English-Chinese aided translation function. From the point of view of professional translators, this tool integrates current Pinyin input methods, statistical machine translation and n-gram statistics technology, reducing keystrokes and saving translation time. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
26. Simple Median based information fusion in wireless sensor network.
- Author
-
Singhal, Deepti and Garimella, Rama Murthy
- Abstract
The accuracy of a system is measured by the deviation of its results from the actual results. Information fusion deals with combining information from the same or different sources to obtain an improved fused estimate with greater quality or relevance. As larger numbers of sensors are deployed in harsher environments, it is important that sensor fusion techniques be robust and fault-tolerant, so that they can handle uncertainty and faulty sensor readouts. The sensor nodes in a Wireless Sensor Network (WSN) are constrained in computation and communication resources, and efforts are required to increase the performance measures of the network; sensor fusion techniques should therefore be simple, with low computational complexity. In this paper we propose a novel median-based sensor fusion function named the D function, and show that it satisfies the Lipschitz condition. The paper also presents some ideas which can open new areas of research on the fusion problem. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
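The robustness property that motivates median-based fusion in this abstract is easy to demonstrate. A minimal Python sketch of plain median fusion (the paper's D function itself is not specified here, so this is only the generic idea it builds on):

```python
from statistics import mean, median

def fuse(readings):
    """Median-based fusion: return the median of the sensor
    readings.  Unlike the mean, the median is unaffected by a
    single wildly faulty readout, and as a function of the
    readings it is 1-Lipschitz in the sup norm, the kind of
    property the paper proves for its D function."""
    return median(readings)
```

With one faulty sensor among five, the median stays at the true value while the mean is dragged far off:

```python
readings = [20.1, 19.9, 20.0, 20.2, 95.0]   # last sensor is faulty
fuse(readings)   # 20.1, robust
mean(readings)   # 35.04, badly skewed
```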
27. Composite subdivisions for 3D quad-triangle meshes.
- Author
-
Ng, K. W., Andi, S. W., and Ghauth, K. I.
- Abstract
This paper studies the real-time subdivision of surfaces for rendering using approximation schemes and focuses on two methods, Catmull-Clark and Loop subdivision. The two methods are commonly employed separately in many existing applications and graphics engines: Catmull-Clark performs best on quad meshes, Loop subdivision is superior on triangle meshes, and exchanging the mesh types produces ugly polygonal meshes. In this paper, we propose to combine the methods and subdivide quad-triangle models simultaneously: we average the vertex points generated by both schemes, which closes up the gap at the original vertex points. Our result turns out to be smoother and without defects. Our proposed method has the potential to be employed in other graphics applications such as morphing and animation. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
28. A vector-space retrieval system for contextual awareness.
- Author
-
Delaveau, Ludovic, Loulier, Benjamin, Matson, Eric T., and Dietz, Eric
- Abstract
This paper introduces a retrieval system based on context rather than content. The system uses a set of vector spaces to represent the different contextual characteristics (position, time, sound environment, etc.). In this model both the current context and the items in the corpus are represented by vectors, and vector similarities are used to identify the items relevant to the context. In order to take the user's own perception of the context into account, we also use learning techniques based on user feedback. This paper also presents two applications we are currently developing which make use of this retrieval system: a contextual adaptive user interface selecting the configuration profiles matching the current context as acquired by a computer, and an augmented reality application for iPhone suggesting to the user activities and places that could be of interest given his or her current environment. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
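The core retrieval step this abstract describes (comparing the current context vector against item vectors by similarity) can be sketched with cosine similarity. A hypothetical Python illustration; the paper does not specify which similarity measure it uses:

```python
import math

def cosine(u, v):
    """Cosine similarity between two context vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_by_context(current, items):
    """Rank corpus items (name -> context vector) by similarity
    to the current context vector, most relevant first."""
    return sorted(items, key=lambda name: cosine(current, items[name]),
                  reverse=True)
```

The user-feedback learning mentioned in the abstract would then adjust the item vectors (or per-dimension weights) so the ranking matches the user's own perception of the context.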
29. The design and implementation of a network performance management system based on SNMP.
- Author
-
Chen, Liang and Xu, Ting
- Abstract
Network performance management needs to monitor, analyze and record several performance indexes in order to discover failures as they happen, as well as potential dangers. This paper briefly introduces the goals and basic functions of network performance management, and SNMP-based network performance analysis including port status analysis, port flow analysis and equipment resource utilization analysis. To be able to analyze network performance from multiple angles, this paper designs and implements a network performance management system based on SNMP, which mainly includes network performance data acquisition, data analysis, visual presentation to the user, and critical-event management. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
30. Test calculation for logic and short-circuit faults in digital circuits.
- Author
-
Sziray, Jozsef
- Abstract
In its first part, the paper presents a test calculation principle, called composite justification, which serves for producing tests for logic faults in digital circuits. The considered fault model includes stuck-at-0/1 logic faults, both single and multiple; only combinational logic is taken into consideration here, and the computations are performed at the gate level. In the second part of the paper, composite justification is extended to another fault class, namely short-circuit faults, where a short circuit is an erroneous galvanic coupling between two circuit lines. The calculation principle is comparatively simple: it is based only on successive line-value justification, and it lends itself to realization as an efficient computer program. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
31. DCA: Dynamic Challenging Level Adapter for Real-time Strategy Games.
- Author
-
Chang, Shin-Hung and Yang, Nai-Yan
- Abstract
Recently, Real-Time Strategy (RTS) games such as StarCraft and Age of Empires have become more and more popular. These games attract many players not only through their fancy presentation but also through the challenging game AI of computer opponents. In order to match the challenge to different players, RTS games usually provide several default difficulty levels to choose from; however, these settings cannot always accommodate the requirements of players at different skill levels. This paper therefore proposes a Dynamic Challenging Level Adapter (DCA) mechanism that automatically adapts a computer opponent's behaviour to each player: the player does not need to choose a difficulty level in advance, and the game AI controlled by the DCA mechanism dynamically adapts to meet the player's challenge level. The paper proposes that the warrior is the most important element in deciding the result of a match in RTS games; the main idea underlying the DCA mechanism is therefore to analyze the warrior capabilities of the two players and adjust action strategies in real time while fighting the opponent. To test the effectiveness of the DCA mechanism, this study uses the popular RTS game StarCraft II (SC II) as the experimental platform, employing the game's default difficulty-level AI to emulate players who fight opponents controlled by the DCA mechanism. Additionally, a challenging rate (CR) formula is proposed as a strength evaluation between two opponents. The experimental results show that the difference in CR value of a computer opponent improved by the DCA mechanism can be reduced by more than 80%; moreover, the duration of a game match usually doubles, and the winner is decided near the end of each match. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
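The CR formula itself is not given in the abstract, so the following is only a hypothetical sketch of the adaptation loop described: here CR is assumed to be the ratio of the AI's warrior strength to the player's, and the mechanism nudges the AI's strength budget whenever CR drifts from parity. The names `challenging_rate` and `adapt_strength` and the specific update rule are illustrative assumptions, not the paper's method.

```python
# Hypothetical sketch of a DCA-style adaptation loop.
# The real CR formula is not given in the abstract; here CR is taken
# to be the ratio of the AI's warrior strength to the player's.

def challenging_rate(ai_strength: float, player_strength: float) -> float:
    """Assumed CR: > 1 means the AI is stronger, 1.0 means parity."""
    return ai_strength / max(player_strength, 1e-9)

def adapt_strength(ai_strength: float, player_strength: float,
                   step: float = 0.1) -> float:
    """Nudge the AI's strength budget toward parity (CR -> 1)."""
    cr = challenging_rate(ai_strength, player_strength)
    if cr > 1.0:        # AI too strong: throttle its army value
        return ai_strength * (1.0 - step)
    if cr < 1.0:        # AI too weak: let it build up
        return ai_strength * (1.0 + step)
    return ai_strength

# Iterating the loop pulls CR toward 1 regardless of the player's level,
# mirroring the abstract's claim that the CR gap shrinks.
ai = 50.0
for _ in range(40):
    ai = adapt_strength(ai, player_strength=100.0)
```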
32. An Implementation of Parallel 1-D FFT on the K Computer.
- Author
-
Takahashi, Daisuke, Uno, Atsuya, and Yokokawa, Mitsuo
- Abstract
In this paper, we propose an implementation of a parallel one-dimensional fast Fourier transform (FFT) on the K computer. The proposed algorithm is based on the six-step FFT algorithm, which can be reformulated as a recursive six-step FFT to reduce the number of cache misses; the recursive form improves performance by utilizing the cache memory effectively. We use this recursive six-step FFT algorithm to implement the parallel one-dimensional FFT and report its performance on the K computer. We achieved over 18 TFlops on 8192 nodes of the K computer (the full system has 82944 nodes at 128 GFlops/node, for a 10.6 PFlops peak) for a 2^41-point FFT. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
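The six-step decomposition the abstract builds on can be sketched directly. The version below is a minimal, unoptimized Python rendering of the classic scheme (transpose, row FFTs, twiddle multiply, transpose, row FFTs, transpose) for a length n1*n2 input; the paper's implementation applies the same decomposition recursively and in parallel, which this sketch does not attempt.

```python
import cmath

def dft(row):
    """Naive O(m^2) DFT used as the row transform. A real six-step code
    would recurse here, which is what makes it cache-friendly."""
    m = len(row)
    return [sum(row[j] * cmath.exp(-2j * cmath.pi * j * k / m)
                for j in range(m)) for k in range(m)]

def six_step_fft(x, n1, n2):
    """Six-step FFT of a length n1*n2 signal: the input is indexed as
    x[j2*n1 + j1] and the output as X[k1*n2 + k2]."""
    n = n1 * n2
    assert len(x) == n
    # Step 1: view x as an n2 x n1 matrix and transpose to n1 x n2.
    b = [[x[j2 * n1 + j1] for j2 in range(n2)] for j1 in range(n1)]
    # Step 2: n2-point DFT along each row.
    c = [dft(row) for row in b]
    # Step 3: multiply twiddle factors w^(j1*k2), w = exp(-2*pi*i/n).
    for j1 in range(n1):
        for k2 in range(n2):
            c[j1][k2] *= cmath.exp(-2j * cmath.pi * j1 * k2 / n)
    # Step 4: transpose to n2 x n1.
    d = [[c[j1][k2] for j1 in range(n1)] for k2 in range(n2)]
    # Step 5: n1-point DFT along each row.
    e = [dft(row) for row in d]
    # Step 6: transpose back and flatten as X[k1*n2 + k2].
    return [e[k2][k1] for k1 in range(n1) for k2 in range(n2)]
```

Comparing against a direct DFT of the same input confirms the decomposition is exact up to floating-point error.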
33. Evaluation of engineering laboratories.
- Author
-
Rashid, Muhammad, Tasadduq, Imran A., Zia, Yousuf Irfan, Al-Turkistany, Mohammad, and Rashid, Saima
- Abstract
Engineering studies consist of two parts: theory lectures and laboratory practices. Effectiveness of laboratories plays an important role in providing necessary design skills. However, evaluation of laboratories can be subjective and inconsistent. Consequently, a criterion is required to evaluate the engineering laboratories. This paper proposes an evaluation criterion for assessing the effectiveness of engineering laboratories in terms of pedagogic aspects. The identified pedagogic aspects in this paper are: relationship between theory and laboratory practice, content level, activity level, learning environment and laboratory manual. We evaluated seven different laboratories and generated recommendations based on our evaluation results. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
34. Document Classification through Building Specified N-Gram.
- Author
-
Ko, Byeongkyu, Choi, Dongjin, Choi, Chang, Choi, Junho, and Kim, Pankoo
- Abstract
This paper proposes a method for classifying textual documents using a specified n-gram data set. Web documents hold great potential, and the amount of valuable information on the web has grown steadily year after year. Because of the web's sheer size, finding the documents relevant to a user's needs has become increasingly difficult, and many approaches have been suggested to overcome this obstacle. A central task is classifying textual documents into predefined categories. Many statistical approaches have been introduced over the years, but no perfect solution has been found. In this paper, we suggest an n-gram-based method for textual document classification. N-gram frequencies have great potential for measuring the similarity between documents, so we construct our own n-gram data sets from research papers. When an unknown document arrives, the system extracts its n-grams and compares them with the n-grams in the prebuilt data sets using the proposed similarity measure. The precision of this method reaches 86%. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
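The abstract does not specify the proposed similarity measure, so the sketch below substitutes a common baseline: character n-gram frequency profiles per category compared by cosine similarity. The profile texts and the choice of trigrams are illustrative assumptions, not the paper's data sets.

```python
from collections import Counter
from math import sqrt

def ngrams(text: str, n: int = 3) -> Counter:
    """Character n-gram frequency profile of a document."""
    t = text.lower()
    return Counter(t[i:i + n] for i in range(len(t) - n + 1))

def cosine(p: Counter, q: Counter) -> float:
    """Cosine similarity between two n-gram profiles."""
    dot = sum(c * q[g] for g, c in p.items())
    norm = (sqrt(sum(c * c for c in p.values()))
            * sqrt(sum(c * c for c in q.values())))
    return dot / norm if norm else 0.0

def classify(doc: str, category_profiles: dict) -> str:
    """Assign doc to the category whose profile is most similar."""
    p = ngrams(doc)
    return max(category_profiles, key=lambda c: cosine(p, category_profiles[c]))

# Toy category profiles standing in for the paper's research-paper sets.
profiles = {
    "networking": ngrams("packet routing protocol latency bandwidth network"),
    "databases":  ngrams("query index transaction relational schema table"),
}
```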
35. Communication Protocols in Layered Groups with Heterogeneous Clocks.
- Author
-
Duolikun, Dilawaer and Takizawa, Makoto
- Abstract
In distributed systems, a group of multiple processes must cooperate to achieve shared objectives. In this paper, we consider a distributed system with no centralized coordinator. Every member process is a peer: each peer has to autonomously make decisions by itself through communicating with the other peers in its group. Each peer sends messages to every peer, and every other peer in the group receives them. In group communication, each message sent by a peer has to be causally delivered to every peer. Several types of clocks (linear clocks, vector clocks, and physical clocks) are used to causally order messages. The linear clock can be used in a scalable group, since its message overhead is O(1); however, it also orders some messages that need not be causally ordered. The vector clock, by contrast, orders all and only the messages that must be causally ordered, but it cannot be adopted in a scalable group because its message overhead is O(n) for a group of n peers. A computer is equipped with a physical clock synchronized with a time server, and each peer can read the physical clock of the computer on which it runs. Messages can then be totally ordered by the physical times at which they are sent; however, a pair of messages that should be causally ordered might be delivered in the reverse order because of differences among physical clocks. In a scalable group, it is not easy for each peer to send a message to every other peer, so in this paper we discuss a hierarchical group structure for realizing scalable groups. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
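The trade-off described above can be made concrete with a minimal vector-clock sketch: each message carries a length-n timestamp (the O(n) overhead the abstract mentions), and one message causally precedes another iff its clock is componentwise less than or equal and strictly less in at least one component.

```python
# Minimal vector-clock sketch for a group of n peers.

def send_event(clock: list, me: int) -> list:
    """Tick own component and return the timestamp to attach."""
    clock[me] += 1
    return clock.copy()

def receive_event(clock: list, me: int, ts: list) -> None:
    """Merge the received timestamp componentwise, then tick own entry."""
    for i in range(len(clock)):
        clock[i] = max(clock[i], ts[i])
    clock[me] += 1

def causally_precedes(a: list, b: list) -> bool:
    """a -> b iff a <= b componentwise and a != b."""
    return all(x <= y for x, y in zip(a, b)) and a != b

# Peer 0 sends m1; peer 1 receives it and then sends m2, so m1 -> m2.
c0, c1 = [0, 0], [0, 0]
t1 = send_event(c0, 0)          # t1 = [1, 0]
receive_event(c1, 1, t1)        # c1 = [1, 1]
t2 = send_event(c1, 1)          # t2 = [1, 2]
```

Two timestamps where neither precedes the other correspond to concurrent messages, which need not be ordered at all; that is exactly the precision the O(1) linear clock gives up.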
36. Development of an automatic health screening system for Student Health Education of University.
- Author
-
Miyazaki, Eiichi, Kamano, Hiroshi, Yamakata, Daisuke, Hori, Yukio, and Imai, Yoshiro
- Abstract
A Health Education Support System has been designed and is being implemented for the students of Kagawa University in Japan. This paper describes in detail the Automatic Health Screening System that forms the front end of the Health Education Support System. The main characteristics of the screening system are the use of IC cards for user identification, interfaces to several kinds of physical measuring devices (such as a height meter, weight meter, and blood pressure monitor), and retrieval of measured data through a Web-DB system. Automatic identification with IC cards reduces human error and misidentification; interfacing the measuring devices directly with computers speeds up screening; and the Web-DB system, together with the university's distributed-campus network environment, enables efficient retrieval of measured data. With this automatic health screening system, almost all students can easily participate in university-level health screening and retrieve their own physical data for healthcare purposes through the dedicated Web-DB system on the network. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
37. Digital Forensics Best Practices and Managerial Implications.
- Author
-
Ali, Khidir M.
- Abstract
Digital forensics is a new discipline in computer science that developed in response to increasing unauthorized activity in computer and information systems. As society and business have become more dependent on information systems, and as e-commerce and online business have become an essential part of today's business world, computer attacks and cyber crimes have continued to rise. For various reasons, the legal system, law enforcement, and computer forensics investigations lag behind in their efforts to track down criminals and prosecute them successfully. This paper gives an overview of forensic computing and discusses key issues to be considered in forensic investigations and digital-evidence analysis processes. The paper also identifies best practices for the digital forensics investigation process and addresses key managerial implications of forensic computing. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
38. Detection of Electromagnetic Trojans Based on API Cycle Mining.
- Author
-
Chen, Rongmao, Zhang, Bofeng, Ren, Jiangchun, and Gong, Zhenghu
- Abstract
Trojans that leak data through computer electromagnetic radiation have gradually aroused concern because of their unusual transmission channel. Current technologies for preventing electromagnetic radiation leaks fall into two types, software and hardware. Hardware countermeasures have a high cost and a limited range of application, while most current software protection mechanisms remain at the theoretical research stage and rely on quite complex principles. This paper explores prevention methods for display-based electromagnetic Trojans. Because such Trojans must operate by changing pixels on the display, the functions they call at the bottom of the system exhibit characteristic sequences. Based on this observation, the paper proposes a Trojan detection method based on API-sequence cycle mining, and also proposes pixel interference as a prevention method. Experiments indicate that the method can effectively detect display-based electromagnetic Trojan programs and that it is versatile, low-cost, and easy to deploy. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
39. Perspectives of using temporal logics for knowledge management.
- Author
-
Mach-Krol, Maria
- Abstract
The paper concerns the possibility of using temporal logics for knowledge management. The idea of knowledge management is presented, along with the most typical computer solutions in this area, and the temporal aspect of knowledge management is pointed out. With this temporal aspect in mind, the paper presents the possible advantages of extending knowledge representation for knowledge management with temporal formalisms. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
40. A methodology applicable to building a classifier of pavement roughness measurement methods and devices.
- Author
-
Janota, Ales and Halgas, Jan
- Abstract
The paper presents a methodological approach used by the authors to design a knowledge base for the domain of measuring different pavement characteristics. Rules are considered as the main representation formalism. The individual steps taken in the process of knowledge discovery in data are discussed, and the process is demonstrated on a limited sample of pavement-roughness measurement data processed by inductive learning algorithms. The paper emphasizes procedural aspects of the design process rather than complete results, which are not yet fully available because the underlying research project is at an early stage. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
41. From NS-2 to NS-3 - Implementation and evaluation.
- Author
-
Kamoltham, Nattavit, Nakorn, Kulit Na, and Rojviboonchai, Kultida
- Abstract
NS-2 is the best-known network simulator, and most researchers use it to evaluate their new protocols and architectures. Although NS-2 offers abundant resources and many tools for generating different network characteristics and scenarios, source code written for NS-2 cannot be reused in real implementations. The release of NS-3 changes the way researchers work: NS-3 allows both simulations and emulations to run on the same implementation and source code. Researchers can therefore evaluate their work on a single computer in simulation, or on a real system, without implementing it twice (once for NS-2 simulation and once for the real system). In this paper, we elaborate the differences in protocol implementation between NS-2 and NS-3, and we recommend a mobility-trace setting that yields consistent results between the two. As a case study, we use DECA, a reliable broadcasting protocol for VANETs previously implemented on NS-2, and show how to port DECA from NS-2 to NS-3 and how to validate its performance. We also describe an emulation of DECA on a real system using NS-3. The emulation results reveal a problem caused by asymmetric links, an issue that most of the literature does not address; we therefore also propose a simple solution that helps protocols operate in asymmetric-link scenarios. The simulation results show that our solution can improve protocol performance. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
42. Computer system model in college examination.
- Author
-
Li, Biao, Gao, Teng, Xing, Dong, Wang, Yingnan, and Guan, Shengxin
- Abstract
With the wide application of computer technology in higher education, the university English examination system is also on its way to becoming a fast, efficient, economical, environmentally friendly, modern examination system. Chinese university English courses are mainly taught in large classes, and administering the same paper examination to all students at one time wastes labor, materials, money, and students' precious learning time, and undermines fairness. The wide application of computer systems can make university English examinations a strong driver of educational efficiency. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
43. A multi-criteria hybrid citation recommendation system based on linked data.
- Author
-
Zarrinkalam, Fattane and Kahani, Mohsen
- Abstract
Citation recommendation systems can help a researcher find works that are relevant to his field of interest. Currently, most approaches in citation recommendation are based on a closed-world view which is limited to using a single data source for recommendation. Such a limitation decreases quality of the recommendations since no single data source contains all required information about different aspects of the literature. This paper proposes a citation recommendation approach based on the open-world view provided by the emerging web of data. It uses multiple linked data sources to create a rich background data layer, and a combination of content-based and multi-criteria collaborative filtering as the recommendation algorithm. Experiments demonstrate that the proposed approach is sound and promising. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
44. Computing the shortest path connecting n sequenced spheres in 3D.
- Author
-
Chou, Chang-Chien, Hsieh, Wan-Kuang, and Lien, I-Yan
- Abstract
This paper introduces the problem of computing the shortest Euclidean path through a sequence of spherical balls in 3D. Based on the authors' previous work and a technique of pairwise local optimization, a heuristic algorithm is proposed to solve the problem. Empirical results compare favorably with random selection and the nearest-neighbor search algorithm. The experiments demonstrate the correctness and fast convergence of the proposed algorithm, so it can be executed quickly on computer systems. Practitioners in solid modeling, prototyping, and other areas of computer-aided design and manufacturing may find useful implications in this paper. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
45. Formal verification of the heavy hitter problem.
- Author
-
Helali, Ghassen, Tahar, Sofiene, and Hasan, Osman
- Abstract
The heavy hitter problem is used to assess the frequency of occurrence of an element in a given data stream. It is one of the most widely used combinatorial tools in many safety-critical domains including medicine, telecommunications and stock exchange markets. Traditionally, the heavy hitter problem is analyzed using paper-and-pencil proofs, simulation or computer algebra systems. These techniques are informal and thus may result in an inaccurate analysis, which poses a serious threat to the reliability of the underlying applications of the problem. To overcome this limitation, we present a formal probabilistic analysis approach for the heavy hitter problem using a higher-order-logic theorem prover (HOL). The paper presents the higher-order-logic model of an algorithm for the heavy hitter problem. This model is then utilized to formally verify some interesting probabilistic and statistical properties associated with the heavy hitter problem in HOL. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
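The abstract does not say which heavy-hitter algorithm was modeled in HOL, so the following sketches one standard streaming solution, the Misra-Gries summary: with k-1 counters, every element occurring more than n/k times in a stream of length n is guaranteed to survive as a candidate.

```python
def misra_gries(stream, k):
    """Misra-Gries summary with k-1 counters. Every element occurring
    more than len(stream)/k times survives as a key; a second pass
    over the stream would be needed to verify exact frequencies."""
    counters = {}
    for x in stream:
        if x in counters:
            counters[x] += 1
        elif len(counters) < k - 1:
            counters[x] = 1
        else:
            # Decrement all counters; drop those that reach zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

# 'a' occurs 6 times in a stream of 10; with k = 3, any element with
# frequency > 10/3 must appear among the candidates.
stream = ["a", "b", "a", "c", "a", "d", "a", "b", "a", "a"]
candidates = misra_gries(stream, 3)
```

The probabilistic guarantees one would verify formally are exactly of this shape: frequent elements are never missed, while reported counts underestimate true frequencies by at most n/k.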
46. Secured web services for home automation in smart grid environment.
- Author
-
Khan, Adnan Afsar and Mouftah, Hussein T.
- Abstract
Smart grid aims to empower the current power grid with the integration of two-way communication and computer technology. The smart home contains a network that connects home elements such as sensors, appliances, and the thermostat. In our previous work, we proposed an approach that used web services to remotely interact with smart home elements in a smart grid environment; these interactions include adjusting the temperature or reading energy consumption. We assumed a smart home with a wireless sensor network based on Zigbee. In this paper, we extend the system to include quality of service, security, and XMPP (eXtensible Messaging and Presence Protocol), with different levels of access control. An advantage of XMPP is that it provides near real-time communication and security. Secured web services also facilitate selling energy back to the grid. The performance, advantages, and limitations of communication between users and home elements via secured web services are demonstrated here. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
47. Using game level design as an applied method for Software Engineering education.
- Author
-
Emam, Ahmed and Mostafa, Mostafa G.
- Abstract
Hands-on training is considered one of the major requirements of most computer science curricula; some courses mandate lab components, while others do not. Students' learning outcomes are usually enhanced by "doing" rather than by simply reading or attending lectures, so the skills acquired through projects and hands-on training are essential for the job market. This paper shows experimentally how students learn outside the classroom through a hands-on project in which they experienced first-hand how to work as a team toward the single goal of engineering and designing a video game level. Level design is the process of creating levels for a video game; the process is usually split among a group of designers, and a level designer is a crucial member of a video game development team, responsible for creating the world the player is placed into. The design process consists of four phases: the idea on paper, hammering out the level, testing the product, and returning to the drawing board. Several concepts should be taken into consideration during the design process; geography, sound, and lighting are highlighted and considered here. The most important finding of this research is that a group-based project, even one that is not part of a formal course offering, using an attractive method such as game level design can have a great impact on students' learning and skill building. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
48. Automatic Evaluation of Document Classification Using N-Gram Statistics.
- Author
-
Choi, Dongjin, Ko, Byeongkyu, Lee, Eunji, Hwang, Myunggwon, and Kim, Pankoo
- Abstract
With the development of World Wide Web technologies, people are surrounded by trillions of web pages at every moment, and the size of the web keeps increasing dramatically. It is therefore becoming harder to find the web documents relevant to what users want to read. Classifying documents into predefined categories is one of the most important tasks in the Natural Language Processing field. Over the years, many statistical and linguistic approaches have been applied to improve on traditional classification machinery, but the problem remains unsolved: there is still no perfect solution for machines to understand human language, and every possibility for making machines think as humans do must be considered. In this paper, we propose a method for classifying textual documents using n-gram co-occurrence statistics, which have great potential for finding similarities between documents. We also compare our proposed method with the traditional method suggested by Keselj. This paper covers only simple approaches and still needs more sophisticated experiments; nevertheless, the performance of this method is better than that of the Keselj approach. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
49. One way to cluster documents by meaning using semantic-ball.
- Author
-
Seo, Chang-Yeol
- Abstract
Semantic technology, which deals with meaning, has difficulty processing meaning as people intend it, because meaning is flexible, unstable, and even uncertain; moreover, there is no explicit definition of how a computer should sense meaning and expression. To overcome these barriers, this paper suggests the Semantic-ball, a co-occurrence of different word sets, based on the assumptions that a Semantic-ball exhibits emergent self-organization, captures group rather than pairwise relationships, and is specific. This paper therefore focuses on finding Semantic-balls to solve semantic problems. The approach is based on conventional statistical methods, the FP-Growth (Frequent Pattern Growth) algorithm [1], and a bottom-up method. Whereas previous research has relied on fixed definitions to pin down meaning, this paper introduces no such definition; Semantic-balls are constructed automatically, using only computation. Semantic-balls thus help to sense words and to cluster documents by meaning. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
50. Floating-point arithmetic on a reduced-instruction-set processor.
- Author
-
Gross, Thomas
- Abstract
Current single-chip implementations of reduced-instruction-set processors do not support hardware floating-point operations; instead, floating-point operations have to be provided either by a co-processor or by software. This paper discusses issues arising from a software implementation of floating-point arithmetic for the MIPS processor, an experimental VLSI architecture. Measurements indicate that an acceptable level of performance is achieved, but this approach is no substitute for a hardware accelerator when higher-precision results are required. The paper includes instruction profiles for the basic floating-point operations and evaluates the usefulness of some aspects of the instruction set. [ABSTRACT FROM PUBLISHER]
- Published
- 1985
- Full Text
- View/download PDF
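As a toy illustration of what a software floating-point routine must do (this is not the MIPS code from the paper), the sketch below multiplies two IEEE-754 doubles using only integer operations: unpack sign, exponent, and significand; multiply the significands; renormalize; repack. It handles only normal, finite, nonzero inputs and truncates instead of rounding to nearest.

```python
import struct

def soft_mul(a: float, b: float) -> float:
    """Toy software multiply for normal, finite, nonzero doubles.
    Truncates the product (no round-to-nearest), so results are exact
    only when the true product fits in 53 bits."""
    (abits,) = struct.unpack(">Q", struct.pack(">d", a))
    (bbits,) = struct.unpack(">Q", struct.pack(">d", b))
    sign = (abits >> 63) ^ (bbits >> 63)
    ea = (abits >> 52) & 0x7FF
    eb = (bbits >> 52) & 0x7FF
    # Prepend the implicit leading 1 to each 52-bit fraction.
    sa = (1 << 52) | (abits & ((1 << 52) - 1))
    sb = (1 << 52) | (bbits & ((1 << 52) - 1))
    sig = sa * sb                      # up to 106 bits wide
    exp = ea + eb - 1023               # remove one copy of the bias
    if sig >> 105:                     # product in [2, 4): renormalize
        sig >>= 53
        exp += 1
    else:                              # product in [1, 2)
        sig >>= 52
    bits = (sign << 63) | (exp << 52) | (sig & ((1 << 52) - 1))
    return struct.unpack(">d", struct.pack(">Q", bits))[0]
```

A production routine additionally needs round-to-nearest-even, zero, subnormal, infinity, NaN, and overflow handling, which is precisely the bookkeeping that makes software floating point so much slower than a hardware unit.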