123 results
Search Results
2. Natural computing: A problem solving paradigm with granular information processing.
- Author
-
Pal, Sankar K. and Meher, Saroj K.
- Subjects
NATURAL computation ,PROBLEM solving ,INFORMATION processing ,APPLICATION software ,COMPUTER systems ,COMPUTER science - Abstract
Highlights: [•] Granular computing aspects of natural computing. [•] Review of different granular soft computing research. [•] Biological motivation, design principles, application areas, open research problems and challenging issues of these models. [Copyright © Elsevier]
- Published
- 2013
- Full Text
- View/download PDF
3. Bibliometric Analysis on Research Trends of International Journal of Computers Communications & Control.
- Author
-
Wang, X. X., Xu, Z. S., and Dzitac, I.
- Subjects
AUTOMATION ,TREND analysis ,COMPUTER systems ,COMPUTER science ,SYSTEMS theory ,BIBLIOGRAPHIC databases - Abstract
International Journal of Computers Communications & Control (IJCCC) is an international journal in the fields of automation control systems and computer science. According to Web of Science (WoS), the first document of IJCCC was published in 2006. In this paper, we study the research trends of publications in IJCCC by performing a bibliometric analysis from 2006 to 2019. 982 publications are selected from WoS after data preprocessing with VOSviewer and CiteSpace. Firstly, fundamental information about the publications is explored, including the type, the annual trend and the most cited publications in IJCCC. Secondly, characteristics of countries/regions, institutions and authors are presented in terms of evaluation indicators. Next, landscape analysis is conducted to show the development of IJCCC at the level of countries/regions, institutions, authors and references, through co-authorship analysis, bibliographic coupling analysis, co-citation and burst detection analysis, and co-occurrence and timeline view analysis. Based on these analyses, discussions of current challenges and possible research trends of IJCCC are provided. Finally, the main findings are summarized. This paper offers a valuable reference for scholars to understand the research trends of IJCCC and grasp hot topics in related fields. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
4. Parallel Cloth Simulation Using OpenGL Shading Language.
- Author
-
Hongly Va, Min-Hyung Choi, and Min Hong
- Subjects
CENTRAL processing units ,COMPUTER systems ,COMPUTER science ,STATISTICAL correlation ,COMPUTER software - Abstract
The primary goal of cloth simulation is to express object behavior in a realistic manner and achieve real-time performance by following the fundamental concepts of physics. In general, the mass-spring system is applied to real-time cloth simulation with three types of springs. However, cloth simulation with hard springs using the mass-spring system requires a small integration time-step in order to use a large stiffness coefficient. Furthermore, to obtain stable behavior, constraint enforcement is used instead of maintaining the force of each spring. Constraint force computation involves a large sparse linear solving operation. Due to this large computation, we implement a cloth simulation using adaptive constraint activation and deactivation techniques that combine the mass-spring system with the constraint enforcement method to prevent excessive elongation of the cloth. When the length of a spring is stretched or compressed beyond a defined threshold, the adaptive constraint activation and deactivation method deactivates the spring and generates an implicit constraint. The traditional method, which uses a serial process on the Central Processing Unit (CPU) to solve the system in every frame, cannot handle a complex cloth model structure in real time. Our simulation utilizes Graphics Processing Unit (GPU) parallel processing with compute shaders in the OpenGL Shading Language (GLSL) to solve the system effectively. In this paper, we design and implement a parallel method for cloth simulation, and experiment on the performance and behavior of the mass-spring system, constraint enforcement, and adaptive constraint activation and deactivation techniques using the GPU-based parallel method. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
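The mass-spring formulation summarized in the abstract above reduces, at its core, to Hooke's-law forces along springs. The sketch below illustrates only that core; the function name and the 2-D setting are illustrative assumptions, not the paper's GLSL implementation.

```python
# Minimal mass-spring force sketch (a simplification for illustration):
# each spring applies a Hooke's-law force along the vector between its
# two endpoint masses.

def spring_force(p1, p2, rest_length, stiffness):
    """Force on the mass at p1 exerted by a spring to p2 (2-D points)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    length = (dx * dx + dy * dy) ** 0.5
    if length == 0.0:
        return (0.0, 0.0)
    # Magnitude is proportional to the deviation from the rest length.
    magnitude = stiffness * (length - rest_length)
    return (magnitude * dx / length, magnitude * dy / length)

# A stretched spring pulls its endpoints together.
fx, fy = spring_force((0.0, 0.0), (2.0, 0.0), rest_length=1.0, stiffness=10.0)
print(fx, fy)  # 10.0 0.0
```

The paper's adaptive technique additionally activates an implicit constraint when the deviation from the rest length crosses a threshold; that test would sit on top of the same length computation.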
5. Cyclic Autoencoder for Multimodal Data Alignment Using Custom Datasets.
- Author
-
Zhenyu Tang, Jin Liu, Chao Yu, and Wang, Y. Ken
- Subjects
CONVOLUTIONAL neural networks ,COMPUTER systems ,INTERNET of things ,COMPUTER science ,DEEP learning ,COVID-19 pandemic - Abstract
Subtitle recognition under multimodal data fusion in this paper aims to recognize text lines from image and audio data. Most existing multimodal fusion methods rely on pre-fusion or post-fusion, which is neither well justified nor easy to interpret. We believe that fusing images and audio before the decision layer, i.e., intermediate fusion, to take advantage of the complementary multimodal data, will benefit text line recognition. To this end, we propose: (i) a novel cyclic autoencoder based on a convolutional neural network. The feature dimensions of the two modalities are aligned under the premise of stabilizing the compressed image features, so the high-dimensional features of different modalities are fused at a shallow level of the model. (ii) A residual attention mechanism that improves recognition performance: regions of interest in the image are enhanced and regions of disinterest are weakened, so we can extract the features of text regions without further increasing the depth of the model. (iii) A fully convolutional network for video subtitle recognition. We choose DenseNet-121 as the backbone network for feature extraction, which effectively enables the recognition of video subtitles in complex backgrounds. The experiments are performed on our custom datasets, and the automatic and manual evaluation results show that our method reaches the state of the art. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
6. COVID-19 Automatic Detection Using Deep Learning.
- Author
-
Sanajalwe, Yousef, Anbar, Mohammed, and Al-E'mari, Salam
- Subjects
DEEP learning ,COMPUTER systems ,INTERNET of things ,COMPUTER science ,COVID-19 pandemic - Abstract
The novel coronavirus disease 2019 (COVID-19) is a pandemic disease that is currently affecting over 200 countries around the world and impacting billions of people. The first step to mitigate and control its spread is to identify and isolate infected people. However, because of the shortage of reverse transcription polymerase chain reaction (RT-PCR) tests, it is important to discover suspected COVID-19 cases as early as possible by other means, such as scan analysis and chest X-ray analysis by radiologists. Chest X-ray analysis is relatively time-consuming, though, requiring more than 15 minutes per case. In this paper, a novel automated detection model is proposed to perform real-time detection of COVID-19 cases. The proposed model consists of three main stages: image segmentation using the Harris Hawks optimizer, synthetic image augmentation using an enhanced Wasserstein and Auxiliary Classifier Generative Adversarial Network, and image classification using a Convolutional Neural Network. Raw chest X-ray image datasets are used to train and test the proposed model. Experiments demonstrate that the proposed model is very efficient in the automatic detection of COVID-19 positive cases. It achieved 99.4% accuracy, 99.15% precision, 99.35% recall, 99.25% F-measure, and 98.5% specificity. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
7. Pseudonym Mutable Based Privacy for 5G User Identity.
- Author
-
Saeed, Rashid A., Saeed, Mamoon M., Mokhtar, Rania A., Alhumyani, Hesham, and Abdel-Khalek, S.
- Subjects
5G networks ,INTERNET of things ,COMPUTER systems ,COMPUTER science ,ALGORITHMS - Abstract
Privacy, identity preservation and integrity have become key problems for telecommunication standards. Significant privacy threats are expected in 5G networks considering the large number of devices that will be deployed. As the Internet of Things (IoT) and long-term evolution for machine type communication (LTE-M) grow very fast with massive data traffic, the risk of privacy attacks will greatly increase. Given these issues, standards bodies should ensure users' identity and privacy in order to gain the trust of service providers and industries. Against such threats, 5G specifications require a rigid and robust privacy procedure. Many research studies have addressed user privacy in 5G networks. This paper proposes a method to enhance user identity privacy in 5G systems through a scheme that protects the international mobile subscriber identity (IMSI) using a mutable mobile subscriber identity (MMSI) that changes randomly and avoids the exchange of IMSIs. It maintains authentication and key agreement (AKA) structure compatibility with previous mobile generations and improves user equipment (UE) synchronization with home networks. The proposed algorithm adds no computation overhead to the UE or the network except a small amount in the home subscriber server (HSS). The proposed mutable pseudonym scheme uses the XOR function to send the MMSI from the HSS to the UE, which significantly reduces the encryption overhead. The proposed solution was verified with ProVerif. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
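The XOR-based MMSI delivery described in the abstract above can be sketched in a few lines: the HSS masks a freshly generated pseudonym with a value already shared with the UE, and the UE unmasks it with the same XOR. The key handling, byte lengths, and helper names below are illustrative assumptions, not the paper's actual protocol.

```python
# Hypothetical sketch of an XOR-masked pseudonym exchange. XOR is its
# own inverse, so applying the same mask twice recovers the plaintext,
# with negligible computational overhead compared to full encryption.

def xor_bytes(data: bytes, mask: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(a ^ b for a, b in zip(data, mask))

shared = bytes.fromhex("0f1e2d3c4b5a6978")   # assumed pre-shared value
mmsi = bytes.fromhex("1122334455667788")     # new random pseudonym

masked = xor_bytes(mmsi, shared)             # what the HSS would transmit
recovered = xor_bytes(masked, shared)        # what the UE would compute

assert recovered == mmsi
print(masked.hex())
```

The involution property (mask twice, get the original back) is what keeps the per-message cost down to a single pass of XOR on each side.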
8. Concepts as decision functions. The issue of epistemic opacity of conceptual representations in artificial computing systems.
- Author
-
Stacewicz, Paweł and Greif, Hajo
- Subjects
COMPUTER systems ,ARTIFICIAL intelligence ,KNOWLEDGE representation (Information theory) ,COMPUTER science ,CHARACTERISTIC functions - Abstract
The treatment of concepts as decision functions, characteristic of computer science and Artificial Intelligence, has its roots in the logical tradition of Gottlob Frege. In its modern incarnation, decision functions, embedded in concrete knowledge representations, describe either crisp or vague assignments of categorization decisions to objects. In order to allow for effective communication between humans and computer systems, these decisions should be either effectively explained by the system or explainable with respect to the system, which in turn depends on the degree of epistemic transparency of the computer system and the conceptual representations it uses. In this paper, we distinguish between two basic types of representation: logic-based and nature-based. With respect to the latter, we identify five factors contributing to their relative epistemic opacity: A. structural complexity, B. learning procedures, C. operating on uncertain data, D. lack of translation procedures for low-level rules and representations, and E. lack of complete knowledge about natural computing systems. We consider the fourth factor (D.) to be the most important source of opacity on the level of computer systems and the fifth (E.) the main contributing factor on the level of human knowledge. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
9. Introducing two complementary novel algebraic operations: Matrix-separation and Matrices-joining for programming evaluation and development.
- Author
-
Badr, Assem
- Subjects
ARTIFICIAL neural networks ,DATA mining ,BIG data ,COMPUTER systems ,COMPUTER science ,AUTOMATIC extracting (Information science) - Abstract
In the last decade, many computer science disciplines, such as artificial neural networks, data recognition, data mining and big data, have been manipulating large amounts of data. Recently, many programmers and application designers have relied on matrices in their algorithms to handle large numbers of data items simultaneously and to enrich the parallelism of their systems. However, due to the growth of the amount of data in the matrices, there is a need to reshape these matrices according to the capabilities of the computer systems that process the data. Consequently, our idea is to develop algebraic operations that can represent the reshaping of matrices within mathematical equations. In this paper, we fully develop two novel compatible algebraic matrix operations to solve this problem. The first operation is called "Matrix-Separation" and the second is denoted "Matrices-Joining". These two novel operations assist programmers and scientists in program evaluation and development. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
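The abstract above does not define the two operations formally, so the following is only a guessed illustration of the underlying idea: splitting a matrix into blocks sized to a machine's capacity, and joining such blocks back into the original. The row-wise split and the function names are assumptions, not the paper's definitions.

```python
# Hypothetical reading of the two operations: "Matrix-Separation" splits
# a matrix (list of rows) into consecutive row blocks, and
# "Matrices-Joining" stacks those blocks back into one matrix.

def matrix_separation(m, rows_per_block):
    """Split matrix m into consecutive blocks of at most rows_per_block rows."""
    return [m[i:i + rows_per_block] for i in range(0, len(m), rows_per_block)]

def matrices_joining(blocks):
    """Concatenate row blocks back into a single matrix."""
    return [row for block in blocks for row in block]

m = [[1, 2], [3, 4], [5, 6], [7, 8]]
blocks = matrix_separation(m, 2)
print(blocks)  # [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]

# The two operations are complementary: joining undoes separation.
assert matrices_joining(blocks) == m
```

The round-trip property at the end is the "compatibility" one would expect of any such pair of reshaping operations.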
10. An Ameliorated Methodology to Abstract Object Oriented Features from Programming System.
- Author
-
Handigund, Shivanand M. and Arunakumari, B.N.
- Subjects
UNIFIED modeling language ,COMPUTER security ,COMPUTER programming ,COMPUTER systems ,COMPUTER science - Abstract
In software testing through reverse engineering, programming semiotics are transformed into Unified Modeling Language (UML) semiotics. This process needs the abstraction of UML syntactics and semantics from the programming system. As a preamble, this paper attempts to develop an automated methodology to abstract the object oriented technology (OOT) components required for the class diagram from the programming system to be tested for correctness and completeness. The methodology comprises initial classing and slicing of the programming system on the referenced attributes of entries with no defined attributes; then, using the sole referenced entry and sole defined entry of these sliced programming units, the candidate key(s) are identified and used to find the individual object structure of each class in our classing technique. The programming system is sliced based on individual class structure, and these slices are used to obtain the object methods; through the nature of their parameters, we identify the interrelationships between the classes and their types. An intermediate clairvoyant study of the parameters indicates the visibility of the object methods. This paper contains the detailed procedure for each method of our automated methodology, so their correctness and completeness are self-evident. The proposed procedures are based on sound logical positivism and implicitly incorporate good database design and good software engineering principles. This paper reengineers the programming system into various appropriate UML design diagrams so as to reduce the enormity of the syntactics and semantics of the programming language to limited syntactics and semantics. This enables the development of a correct and complete software testing methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
11. A Study on the Limitations of Evolutionary Computation and other Bio-inspired Approaches for Integer Factorization.
- Author
-
Mishra, Mohit, Gupta, Vaibhav, Chaturvedi, Utkarsh, Shukla, K.K., and Yampolskiy, R.V.
- Subjects
EVOLUTIONARY computation ,COMPUTER programming ,COMPUTER systems ,COMPUTER science ,COMPUTER security - Abstract
Integer factorization is a vital number-theoretic problem that frequently finds application in public-key cryptography, such as RSA encryption systems, and in other areas like Fourier transform algorithms. The problem is believed to be computationally intractable because integer multiplication behaves as a one-way mathematical function: due to this infeasibility, it is extremely hard to find the prime factors of a semiprime number generated from two randomly chosen, similarly sized prime numbers. There has recently been growing interest in the community in evolutionary computation and other alternative approaches to solving this problem as an optimization task. However, the results still seem to be very rudimentary in nature, and there is much work to be done. This paper focuses on such approaches and presents a detailed critical study, putting forth criticism and ideas in this respect. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
12. Fuzzified Expert System for Employability Assessment.
- Author
-
Kumari, Rajani, Kumar, Sandeep, and Sharma, Vivek Kumar
- Subjects
EMPLOYABILITY ,COMPUTER systems ,COMPUTER software ,COMPUTER science ,COMPUTER security - Abstract
Employability is somebody's prospect of gaining and maintaining employment. Basically, employability depends on three basic parameters: education, understanding power and personal development. It is the capability to achieve preliminary employment, to continue it and to obtain a different position if required. This paper introduces an innovative knowledge-based system for the assessment of employability through fuzzy rules. The purpose and scope of this research is to find an optimal assessment of employability. The research considers the three employability skills as inputs, namely education, understanding power and personal development, and finds a novel crisp value for employability that characterizes the ability of the employee. This paper uses twenty-seven fuzzy rules with a Mamdani-type fuzzy inference system in MATLAB to obtain a single output value named employability. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
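The fuzzify-evaluate-defuzzify pipeline the abstract outlines can be illustrated with a toy example. The membership functions, the two rules, and the weighted-average defuzzification below are simplifications chosen for illustration only; they are not the paper's 27-rule Mamdani system.

```python
# Toy fuzzy assessment: fuzzify three inputs with a triangular
# membership function, fire an AND-rule (min) and an OR-rule (max),
# then defuzzify by a simple weighted average.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def employability(edu, understanding, dev):
    # Degree to which each input (scaled 0..10) is "high".
    highs = [tri(v, 0, 10, 20) for v in (edu, understanding, dev)]
    rule_all = min(highs)   # all three skills high -> fully employable
    rule_any = max(highs)   # at least one skill high -> partly employable
    # Weighted-average defuzzification over the two rule strengths.
    return (1.0 * rule_all + 0.5 * rule_any) / 1.5

print(employability(10, 10, 10))  # 1.0
```

A full Mamdani system would instead clip output membership functions by each rule's strength and take the centroid of their union; the structure of the computation is the same.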
13. High-Level Representation of Time in Diagrammatic Specification.
- Author
-
Al-Fedaghi, Sabah
- Subjects
TIME management ,COMPUTER systems ,COMPUTER software ,COMPUTER science ,COMPUTER security - Abstract
The notion of time is an important element in systems such as real-time embedded systems. Real-time systems have strict timing constraints, and their complexity is continuously increasing, making their design very challenging. This paper concerns a very high level of requirements specification used for system understanding and communication among stakeholders and as a base for development. It introduces a diagrammatic description of the functional behavior of a system with nonfunctional constraints, including a timing plan. Specifically, this paper explores the representation of time at this level of system description. The usability and feasibility of the proposed method are illustrated by applying it to examples. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
14. New Theory for Code-translation; Punctuator and Qualifier as Two Token Categories and Supplementary Concepts of Programming.
- Author
-
Hutabarat, Bernaridho I.
- Subjects
COMPUTER programming ,COMPUTER systems ,COMPUTER software ,COMPUTER science ,COMPUTER security - Abstract
Mastering programming is no easy job, because a proper science of software does not yet exist. Solid and fundamental theories that apply universally are needed, since universality eases the mastering of any kind of job. To ease the mastering of programming, software science must answer what the universal main concepts in programming are, and what the categories of tokens are. This paper proposes punctuator and qualifier as two supplementary concepts of programming. Together with value, operation, type, and object, punctuator and qualifier serve as the six main concepts of programming. This paper shows that the six main concepts of programming serve as universal categories of tokens. The new theory for code-translation takes these universal token categories into account. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
15. A Hierarchical Method for Solving Soft Nonlinear Constraints.
- Author
-
Hosobe, Hiroshi
- Subjects
COMPUTER graphics ,COMPUTER systems ,COMPUTER science ,COMPUTER programming ,COMPUTER security - Abstract
Constraints that express relationships among objects are used to model and solve various problems arising from fields such as artificial intelligence, software, and computer graphics. Soft constraints are often important for applications that involve complex relationships among objects. This paper proposes a new method for solving soft constraints. The method treats soft nonlinear constraints with hierarchical preferences, and computes solutions that satisfy as many constraints with strong preferences as possible. It adopts the method of Lagrange multipliers to enable the accurate computation of local solutions. The paper also presents the result of a preliminary experiment using a simple geometric example. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
16. An Ameliorated Methodology for the Abstraction of Object Oriented Features from Software Requirements Specification.
- Author
-
Shivaram, A.M. and Handigund, Shivanand M.
- Subjects
REQUIREMENTS engineering ,COMPUTER systems ,COMPUTER science ,COMPUTER programming ,COMPUTER security - Abstract
The Business Process (BP) requirements are specified in the form of a software requirements specification (SRS). This SRS serves as a base for software development. The software needs to be developed through syllogized software development life cycle (SDLC) stages. The SRS, which denotes the requirements of the BP, is used as input to the analysis stage, from which the paradigm-dependent components are to be abstracted. Hitherto, the components have been abstracted manually, with the exception of semi-automated methods for a few components. The SRS is construed with reference to the specific paradigm through which the design is to be developed. Unfortunately, no complete automated methodology exists for the abstraction of all paradigm-dependent components. An automated methodology eliminates the possible human errors that would otherwise have a ripple effect, damaging the entire software. This paper develops an innovative, unique methodology to resurrect the SRS statements as modular statements. It further develops an automated methodology for the abstraction of control and data flow graphs, and an automated methodology for the abstraction of the useful components required for the class diagram. The class diagram emphasizes both structural and behavioral aspects. This facility is effectively used to abstract object class attributes, object methods, visibility, signature, return type, etc. Information systems are developed through software projects that use the specific software requirements specification (SRS) provided for them as the base. The SRS contains details of the information system through which appropriate software can be developed. The information systems are also viewed perceptively through different pragmatics like work, work process or usecase. The usecase is one of the prime perspective views whose sum forms the information system.
In this paper, an attempt is made to abstract the object class, object class attributes, object methods, interrelationships between object classes, and the starting/ending actor from unformatted, unstructured SRS text. This is achieved through our own classing and slicing techniques, which identify the class structure and the object methods respectively. Then the usecase is developed through the interrelationships between object methods of different classes, starting with or ending with the actor. The stages involve moulding the SRS, designing a control flow graph for the SRS and a data flow table for the SRS statements, developing appropriate classing and slicing criteria, creating an actor and actors' interface attributes table, creating slicing criteria for each usecase, and then slicing the relevant statements for each usecase. Here, we have attempted to use Weiser's modified algorithm 1 to abstract the exact usecase. The slicing criterion is obtained through the intersection of the actor's interface attributes and the referenced/defined attributes present between two consecutive actors. Attempts have been made to resolve synonyms and heteronyms present in the SRS. The correctness and completeness of the proposed methodology depend on the realization of the actor and the actors' interface attributes. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
17. MRAI Optimization for BGP Convergence Time Reduction without Increasing the Number of Advertisement Messages.
- Author
-
Alabdulkreem, E.A., Al-Raweshidy, H.S., and Abbod, M.F.
- Subjects
BGP (Computer network protocol) ,COMPUTER systems ,COMPUTER science ,COMPUTER programming ,COMPUTER security - Abstract
The primary cause of the Border Gateway Protocol's (BGP) slow convergence is the Minimum Route Advertisement Interval (MRAI). The MRAI is a timer with a default value of 30 seconds, which forces BGP routers to wait at least that amount of time before sending another advertisement for the same prefix. This process can delay important BGP advertisements. To date, there has been no single value used by all the networks across the Internet. This paper aims to find the optimum value for the MRAI timer that maximally reduces the convergence time without increasing the number of advertisement messages. The optimal MRAI value found in this paper reduces the convergence time by at least 45%. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
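The trade-off the abstract describes can be illustrated with a back-of-the-envelope batching model: updates for one prefix that arrive while the timer runs are coalesced into a single advertisement sent when the timer expires. The event timings below are invented for illustration; this is not the paper's simulation.

```python
# Toy MRAI batching model for a single prefix on a single router.

def advertise(update_times, mrai):
    """Times at which the router sends an advertisement, with MRAI batching."""
    ads, next_allowed = [], 0.0
    pending = sorted(update_times)
    while pending:
        send_at = max(pending[0], next_allowed)
        # Every update that has arrived by send_at goes out in this one message.
        pending = [u for u in pending if u > send_at]
        ads.append(send_at)
        next_allowed = send_at + mrai
    return ads

# Three updates in quick succession, default 30 s MRAI vs. a 2 s MRAI:
print(advertise([0.0, 1.0, 2.0], mrai=30.0))  # [0.0, 30.0]
print(advertise([0.0, 1.0, 2.0], mrai=2.0))   # [0.0, 2.0]
```

Both settings send the same two messages, but the smaller timer delivers the batched update 28 seconds earlier: convergence time drops without the message count rising, which is exactly the regime the paper's optimization targets.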
18. Automated Business Rules Transformation into a Persistence Layer.
- Author
-
Cemus, Karel, Cerny, Tomas, and Donahoo, Michael J.
- Subjects
BUSINESS information services ,COMPUTER systems ,COMPUTER programming ,COMPUTER science ,COMPUTER security - Abstract
Enterprise Information Systems maintain data with respect to various business processes. These processes consist of business operations restricted by business rules expressed as preconditions and post-conditions. Each rule must be considered and enforced throughout the system, from user interface to persistence storage. Such rule evaluation in multiple contexts results in both significant rule restatement and high maintenance complexity, as there is no single focal point for capturing and reusing these rules. In this paper, we apply the Aspect-Oriented Design Approach to the persistence layer to simplify business rules management, enforce business rules throughout the system and consequently decrease development and maintenance efforts. Our preliminary results show that it is possible to define business rules in a single place and then apply them automatically in a persistence layer. We retrieve data sets restricted by given operation post-conditions with respect to current execution context without any manual rule restatement. This paper provides a small case study emphasizing the benefits and future challenges. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
19. Second-Order Spline-Wavelet Robust Code under Non-Uniform Codeword Distribution.
- Author
-
Levina, Alla and Taranov, Sergey
- Subjects
WAVELET transforms ,COMPUTER systems ,COMPUTER programming ,COMPUTER science ,COMPUTER security - Abstract
In computer science, robustness is the ability of a computer system to cope with errors during execution. Robust codes are new nonlinear systematic error-detecting codes that provide uniform protection against all errors, whereas classical linear error-detecting codes detect only a certain class of errors. Defence by linear codes can therefore be ineffective in many channels and environments where the error distribution is unknown. The probability of error masking can increase depending on the codeword distribution; however, mapping the most probable codewords to a predefined set can reduce the maximum of the error-masking distribution. The algorithm proposed in this paper is based on the second-order wavelet decomposition of B-splines under non-uniform nets. We propose a general approach to the construction of spline-wavelet decompositions of a linear space over an arbitrary field, based on the generalization of calibration relations and of functional systems that are biorthogonal to the basic systems of the relevant space. The obtained results permit the construction of a second-order spline-wavelet robust code. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
20. On the Evolution of Solution Spaces Triggered by Emerging Technologies.
- Author
-
Salado, Alejandro and Nilchiani, Roshanak
- Subjects
COMPUTER software ,COMPUTER systems ,COMPUTER science ,SOFTWARE engineering ,COMPUTER software development - Abstract
The term solution space is widely used in the engineering community, yet little is known about how solution spaces evolve. Theoretical research in the field of systems science indicates that requirements can only reduce the solution space. Yet some authors state that, on the contrary, requirements can be used to expand or open new solution spaces. Furthermore, some practitioners argue that a requirement to use a previously nonexistent technology would actually increase the solution space or move it to a new area, while others state that more requirements make life more difficult. Who is right, then? The present paper provides initial answers to this question using systems theory. In order to achieve this, it differentiates between various types of solution spaces, which depend on the systems they include. Finally, the paper provides practical examples to showcase the results of the theoretical findings within real contexts. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
21. Costing for an Autonomous Future: A Discussion on Estimation for Unmanned Autonomous Systems.
- Author
-
Ryan, Thomas R. and Valerdi, Ricardo
- Subjects
COMPUTER software development ,COMPUTER software ,COMPUTER science ,SOFTWARE engineering ,COMPUTER systems - Abstract
This paper presents three areas of discussion in the field of cost estimation for unmanned autonomous systems. First, it proposes a common definition of an unmanned autonomous system. Second, it introduces a method to estimate the cost of unmanned autonomous systems utilizing existing parametric cost estimation tools (SEER-HDR, COCOMO II, COSYSMO) and two relationships: weight and performance. The third discussion focuses on challenges surrounding autonomy. To address these challenges from a cost perspective, the paper recommends modifications to parameters within COCOMO II, via the use of object-oriented function points in lieu of current methods, and within COSYSMO, via the introduction of two cost-driving parameters, namely TVED and HRI-T. Finally, in summary, the paper identifies areas for further research. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
22. Fractional mathematical modeling of the Stuxnet virus along with an optimal control problem.
- Author
-
Kumar, Pushpendra, Govindaraj, V., Erturk, Vedat Suat, Nisar, Kottakkaran Sooppy, and Inc, Mustafa
- Subjects
FIXED point theory ,MATHEMATICAL models ,COMPUTER systems ,CYBERTERRORISM ,COMPUTER science - Abstract
In this digital, internet-based world, it is not new to face cyber attacks from time to time. A number of destructive viruses have been made by hackers, and they have successfully inflicted big losses on our systems. In this family of viruses, the Stuxnet virus is a well-known name. Stuxnet is a very dangerous virus that targets industrial control systems; the main source of this virus can be an infected USB or flash drive. In this research paper, we study a mathematical model to define the dynamical structure and the effects of the Stuxnet virus on our computer systems. To study the given dynamics, we use a modified version of the Caputo-type fractional derivative, which reduces to the classical Caputo derivative under slight changes, which is an advantage of this study. We demonstrate that the given fractional Caputo-type dynamical model has a unique solution using fixed point theory. We derive the solution of the proposed non-linear, non-classical model by applying a recent version of the Predictor-Corrector scheme. We analyze various graphs at different values of the arrival rate of new computers, damage rate, virus transmission rate, and natural removal rate. In the graphical interpretations, we verify the values of the fractional orders and simulate 2-D and 3-D graphics to understand the dynamics clearly. The major novelty of this study is that we formulate the optimal control problem and its important consequences both theoretically and mathematically, which can be further extended graphically. The main contribution of this research work is to provide novel results on the Stuxnet virus dynamics and to explore the uses of fractional derivatives in computer science. The given methodology is effective, fully novel, and very easy to understand. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
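The fractional dynamics summarized in the record above can be sketched numerically. The paper itself uses a Predictor–Corrector scheme; the minimal sketch below instead uses the simpler explicit Grünwald–Letnikov discretization of a Caputo-type equation D^α y = f(t, y), and the test equation and parameter values are illustrative assumptions, not taken from the paper:

```python
import math

def gl_weights(alpha, n):
    # Grünwald–Letnikov coefficients: c_0 = 1, c_k = (1 - (alpha + 1)/k) * c_{k-1}
    c = [1.0]
    for k in range(1, n + 1):
        c.append((1 - (alpha + 1) / k) * c[-1])
    return c

def solve_fractional(f, y0, alpha, h, steps):
    """Explicit Grünwald–Letnikov scheme for the Caputo-type equation
    D^alpha y = f(t, y) with y(0) = y0 and step size h."""
    c = gl_weights(alpha, steps)
    y = [y0]
    for n in range(1, steps + 1):
        # history term: sum_{k=1}^{n} c_k * (y_{n-k} - y0)
        hist = sum(c[k] * (y[n - k] - y0) for k in range(1, n + 1))
        y.append(y0 - hist + h ** alpha * f((n - 1) * h, y[n - 1]))
    return y
```

For α = 1 the coefficients collapse to c_1 = −1 and c_k = 0 for k ≥ 2, so the scheme reduces to the forward Euler method, which gives a quick sanity check against e^{−t} for f(t, y) = −y.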
23. Optimising Kafka for stream processing in latency sensitive systems.
- Author
-
Wiatr, Roman, Słota, Renata, and Kitowski, Jacek
- Subjects
DATA management ,JAVA programming language ,COMPUTER systems ,COMPUTER science ,COMPUTER networks - Abstract
Abstract Many problems, like recommendation services, sensor networks, anti-crime protection, and sophisticated AI services, need online processing of data coming from the environment in the form of data streams consisting of events. The novelty of this approach in the field of stream processing lies in a synergistic effort toward optimization of such systems together with the client components they need, working as a whole. Building a message passing system for gathering information from mission-critical systems can be beneficial, but close attention must be paid to the impact it has on those systems. In this paper, we present the Apache Kafka optimization process for using Kafka as a messaging system in latency-sensitive systems. We propose a set of performance tests that can be used to measure Kafka's impact on the system, along with performance test results for the KafkaProducer Java API. While KafkaProducer has almost no impact on overall system latency, it has a severe impact on resource consumption in terms of CPU. By optimising Kafka for stream processing in latency-sensitive systems, we reduce KafkaProducer's negative impact by 75%. The tests are performed on an isolated production system. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
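The latency-versus-CPU trade-off that drives this kind of producer tuning can be illustrated with a toy batching model. This is not the Kafka API: `batch_size` and `linger_ms` merely mimic the spirit of Kafka's `batch.size`/`linger.ms` settings, and all numbers are illustrative:

```python
def simulate_producer(n_msgs, interarrival_ms, batch_size, linger_ms):
    """Toy batching producer: messages queue until the batch is full or the
    oldest message has lingered long enough. Returns (sends, avg_latency_ms);
    fewer sends stands in for lower CPU cost."""
    sends = 0
    total_latency = 0.0
    batch = []  # arrival times of queued messages
    for i in range(n_msgs):
        t = i * interarrival_ms
        # flush a lingering batch whose deadline passed before this arrival
        if batch and t - batch[0] >= linger_ms:
            flush_t = batch[0] + linger_ms
            total_latency += sum(flush_t - a for a in batch)
            sends += 1
            batch = []
        batch.append(t)
        if len(batch) == batch_size:
            total_latency += sum(t - a for a in batch)
            sends += 1
            batch = []
    if batch:  # final flush at the linger deadline
        flush_t = batch[0] + linger_ms
        total_latency += sum(flush_t - a for a in batch)
        sends += 1
    return sends, total_latency / n_msgs
```

With one message per millisecond, raising the batch size from 1 to 10 cuts sends tenfold while adding a few milliseconds of average queuing latency, which is exactly the dial such optimisation work turns.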
24. Markov Chain Analysis of Agent-Based Evolutionary Computing in Dynamic Optimization.
- Author
-
Byrski, Aleksander and Schaefer, Robert
- Subjects
MARKOV processes ,COMBINATORIAL optimization ,COMPUTER systems ,COMPUTATIONAL complexity ,COMPUTER science ,MULTIAGENT systems - Abstract
Abstract: In this paper a Markov model for Evolutionary Multi-Agent Systems (EMAS) is recalled. The model allows one to study dynamic features of the computation and improves understanding of the considered class of systems, e.g., by proving the ergodicity of the Markov chain modelling EMAS. This property justifies the use of such complex techniques: following Michael Vose's approach, a similar property is proven for EMAS, showing that the system is able to reach any state of the system space (including, of course, the optima sought). The main contribution of the paper is showing how the already proposed model can be applied to dynamic optimization problems. The impact of these enhancements on the ergodicity property is also discussed. [Copyright Elsevier]
- Published
- 2013
- Full Text
- View/download PDF
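Ergodicity of a finite Markov chain is what guarantees convergence to a unique stationary distribution from any starting point, which power iteration exhibits directly. A minimal sketch, with an illustrative two-state transition matrix rather than anything from the paper's EMAS model:

```python
def stationary(P, iters=10_000, tol=1e-12):
    """Power iteration for the stationary distribution of a row-stochastic
    matrix P (list of rows). For an ergodic chain the iteration converges
    to the same pi regardless of the initial distribution."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        new = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(new, pi)) < tol:
            return new
        pi = new
    return pi
```

For P = [[0.9, 0.1], [0.5, 0.5]] the balance equation 0.1·π₀ = 0.5·π₁ gives π = (5/6, 1/6), which the iteration recovers.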
25. Arguing Security of Generic Avionic Mission Control Computer System (MCC) using Assurance Cases.
- Author
-
Poreddy, Bhanuchander Reddy and Corns, Steven
- Subjects
COMPUTER systems ,COMPLEXITY (Philosophy) ,COMPUTER security ,AVIONICS ,SYSTEMS engineering ,COMPUTER science - Abstract
Abstract: An assurance case is a body of evidence organized into an argument demonstrating that some claim about a system holds, i.e., is assured. Assurance cases are used to reason about system safety and serve as a means of showing that a system acceptably satisfies its safety properties; they support rigorous security analysis of safety-critical complex systems. In this paper, the analysis is an approach to documenting an assurance case for system security, i.e., a security assurance case. The paper develops assurance cases for a Generic Avionic Mission Control Computer system by constructing tangible claims and investigating potential vulnerabilities. [Copyright Elsevier]
- Published
- 2011
- Full Text
- View/download PDF
26. EXTENSION OF IEC'S GENERIC DATA ACCESS WITH A LOCKING MECHANISM.
- Author
-
Vukmirovic, S., Erdeljan, A., Lendak, I., and Capko, D.
- Subjects
GENERIC programming (Computer science) ,COMPUTER interfaces ,USER interfaces ,COMPUTER science ,COMPUTER systems ,COMPUTER multitasking ,ELECTRONIC data processing ,DATABASE management ,DATABASES - Abstract
This paper proposes an extension of IEC's Generic Data Access standard (GDA) which allows safer multi-user work. This is achieved by locking resources, and/or the resources associated with them, before starting to edit them. Locking can eliminate data inconsistencies and conflicts generated in multi-user environments and thereby allow faster data entry. Another important benefit of this solution is the implicit locking of larger portions of the model. The GDA defines a set of simple interfaces allowing access to data from any data model. The new set of functions proposed in this paper allows clients to lock and unlock resources, get the list of locked resources, or check whether a specific resource is locked. The paper describes a solution that is universal and configurable, allowing locking rules to be changed without changing the source code of the GDA server. The new interface is implemented and tested in an industrial environment. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
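The locking behaviour described above can be sketched as a small lock manager. The class and method names here are hypothetical illustrations, not the IEC GDA interface: the key idea shown is that locking a resource implicitly locks its associated resources, and conflicting requests from other clients are rejected.

```python
class LockManager:
    """Sketch of lock/unlock with implicit locking of associated resources."""

    def __init__(self, associations):
        self.associations = associations  # resource -> list of related resources
        self.locks = {}                   # resource -> owning client

    def _closure(self, resource):
        # the resource plus everything (transitively) associated with it
        seen, stack = set(), [resource]
        while stack:
            r = stack.pop()
            if r not in seen:
                seen.add(r)
                stack.extend(self.associations.get(r, []))
        return seen

    def lock(self, owner, resource):
        targets = self._closure(resource)
        if any(self.locks.get(r) not in (None, owner) for r in targets):
            return False  # conflict with another client
        for r in targets:
            self.locks[r] = owner
        return True

    def unlock(self, owner, resource):
        for r in self._closure(resource):
            if self.locks.get(r) == owner:
                del self.locks[r]

    def is_locked(self, resource):
        return resource in self.locks
```

Because locking rules live in the `associations` table, they can be reconfigured without touching the manager itself, mirroring the configurability the paper claims for the server.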
27. Tagged-Sub Optimal Code (TSC) Compression for Improving Performance of Web Services.
- Author
-
Rassan, Iehab Al and Alyahya, Haifa
- Subjects
WEB services ,COMPUTER systems ,COMPUTER software ,COMPUTER science ,COMPUTER security - Abstract
Compression can be used to reduce the size of files and speed up transmission time over networks. However, not all compression techniques have the same features and capabilities for improving transmission performance. This paper presents a comparison between different compression algorithms aimed at improving the performance of web services over the Internet. Nowadays, Service Oriented Architecture (SOA) is used heavily for interaction between loosely coupled services, which function independently. Therefore, fast and efficient services offered through web services are needed: enhancing the performance of web services improves overall system performance. As a result, compressing and reducing the size of SOAP messages traveling over the network improves web-service performance. This paper compares the performance of web services when SOAP messages are compressed using Tagged Sub-optimal Code (TSC) and the Huffman encoding algorithm. Experimental results show that web services compressed using TSC perform faster than uncompressed web services and web services compressed using Huffman encoding. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
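The TSC scheme itself is not described in the abstract, but the Huffman baseline it is compared against can be sketched in a few lines (the sample text is illustrative):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Greedy Huffman construction; returns symbol -> prefix-free bitstring."""
    heap = [(freq, i, {sym: ""})
            for i, (sym, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tick = len(heap)  # tie-breaker so tuples never compare dicts
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tick, merged))
        tick += 1
    return heap[0][2]

def compress(text):
    codes = huffman_codes(text)
    return "".join(codes[ch] for ch in text), codes

def decompress(bits, codes):
    # prefix-free codes make greedy left-to-right decoding unambiguous
    rev = {c: s for s, c in codes.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in rev:
            out.append(rev[cur])
            cur = ""
    return "".join(out)
```

A SOAP payload compressed this way round-trips losslessly while using fewer bits than 8-bit characters, which is the effect both algorithms in the paper exploit.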
28. New Undecidability Results for Properties of Term Rewrite Systems.
- Author
-
Verma, Rakesh
- Subjects
ALGORITHMS ,REWRITING systems (Computer science) ,COMPUTER systems ,MACHINE theory ,MATHEMATICAL models ,COMPUTER science - Abstract
Abstract: This paper is on several basic properties of term rewrite systems: reachability, joinability, uniqueness of normal forms, unique normalization, confluence, and existence of normal forms, for subclasses of rewrite systems defined by syntactic restrictions on variables. All these properties are known to be undecidable for the general class and decidable for ground (variable-free) systems. Recently, there has been impressive progress on efficient algorithms or decidability results for many of these properties. The aim of this paper is to present new results and organize existing ones to clarify further the boundary between decidability and undecidability for these properties. Another goal is to spur research towards a complete classification of these properties for subclasses defined by syntactic restrictions on variables. The proofs of the presented results may be intrinsically interesting as well due to their economy, which is partly based on improved reductions between some of the properties. [Copyright Elsevier]
- Published
- 2012
- Full Text
- View/download PDF
29. AGAPIA v0.1: A Programming Language for Interactive Systems and Its Typing System.
- Author
-
Dragoi, Cezara and Stefanescu, Gheorghe
- Subjects
PROGRAMMING languages ,ELECTRONIC data processing ,COMPUTER programming ,INTERACTIVE computer systems ,COMPUTER systems ,COMPUTER science - Abstract
Abstract: A model (consisting of rv-systems), a core programming language (for developing rv-programs), several specification and analysis techniques appropriate for modeling, programming and reasoning about interactive computing systems have been recently introduced by Stefanescu using register machines and space-time duality, see [Stefanescu, G. Interactive systems with registers and voices. Fundamenta Informaticae 73 (2006), 285–306. (Early draft, School of Computing, National University of Singapore, July 2004.)]. After that, Dragoi and Stefanescu have developed structured programming techniques for rv-systems and their verification, see, e.g., [Dragoi, C., and G. Stefanescu. Structured programming for interactive rv-systems. Institute of Mathematics of the Romanian Academy, IMAR Preprint 9/2006, Bucharest 2006. Dragoi, C., and G. Stefanescu. Towards a Hoare-like logic for structured rv-programs. Institute of Mathematics of the Romanian Academy, IMAR Preprint 10/2006, Bucharest, 2006. Dragoi, C., and G. Stefanescu. Implementation and verification of ring termination detection protocols using structured rv-programs. Annals of University of Bucharest, Mathematics-Informatics Series, 55 (2006), 129–138. Dragoi, C., and G. Stefanescu. Structured interactive programs with registers and voices and their verification. Draft, Bucharest, January 2007. Dragoi, C., and G. Stefanescu. On compiling structured interactive programs with registers and voices. In: “Proc. SOFSEM 2008,” 259–270. LNCS 4910, Springer, 2008.]. In the present paper a kernel programming language AGAPIA v0.1 for interactive systems is introduced. The language contains definitions for complex spatial and temporal data, arithmetic and boolean expressions, modules, and while-programming statements with their temporal, spatial, and spatio-temporal versions. 
In AGAPIA v0.1 one can write programs for open processes located at various sites and having their temporal windows of adequate reaction to the environment. The main technical part of the paper describes a typing system for AGAPIA v0.1 programs. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
30. Validating for Liveness in Hidden Adversary Systems.
- Author
-
Mukherjee, Saikat, Srinivasa, Srinath, and D, Satish Chandra
- Subjects
INTERACTIVE computer systems ,MACHINE theory ,COMPUTER systems ,COMPUTER science ,COMPUTER programming - Abstract
Abstract: Multi-stream interactive systems can be seen as “hidden adversary” systems (HAS), where the observable behaviour on any interaction channel is affected by interactions happening on other channels. One way of modelling HAS is in the form of a multi-process I/O automata, where each interacting process appears as a token in a shared state space. Constraints in the state space specify how the dynamics of one process affects other processes. We define the “liveness criterion” of each process as the end objective to be achieved by the process. The problem now for each process is to achieve this objective in the face of unforeseen interferences from other processes. In an earlier paper, it was proposed that this uncertainty can be mitigated by collaboration among the disparate processes. Two types of collaboration philosophies were also suggested: altruistic collaboration and pragmatic collaboration. This paper addresses the HAS validation problem where processes collaborate altruistically. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
31. Services and Contracts: Coalgebraically.
- Author
-
Sun, Meng
- Subjects
COMPUTER systems ,COMPUTER architecture ,COMPUTER interfaces ,AXIOMS ,COMPUTER science - Abstract
Abstract: The popularity of service-oriented computing has not been accompanied by the necessary formalization of the notions being involved. This paper focuses on the development of a coalgebraic framework to support service-oriented application design. In this paper, the concepts are separated into three hierarchies – interfaces, contracts and services. Interfaces are specified by functors, and services are shown to be coalgebras of such functors, which should satisfy the axioms given in corresponding contracts. Different interfaces, contracts and services are related respectively by the morphisms between them. And the notion of bisimulation for services is derived from service morphisms, which captures the observational equivalence of services. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
32. The Case for Fairness of Trust Management.
- Author
-
Wierzbicki, Adam
- Subjects
COMPUTER security ,COMPUTER systems ,UTILITIES (Computer programs) ,SYSTEMS software ,COMPUTER science - Abstract
Abstract: All trust management systems must take into account the possibility of error: of misplaced trust. Therefore, regardless of whether it uses reputation or not, or is centralized or distributed, a trust management system must be evaluated with consideration for the consequences of misplaced or abused trust. Thus, the issue of fairness has always been implicitly considered in the design and evaluation of trust management systems. This paper attempts to show that an implicit consideration, using the utilitarian paradigm of maximizing the sum of agents' utilities, is insufficient. Two case studies presented in the paper concern the design of a new reputation system that uses implicit and emphasized negative feedback, and the evaluation of reputation systems' robustness to discrimination. The case studies demonstrate that considering fairness explicitly leads to different trust management system design and evaluation. Trust management systems can realize a goal of system fairness, identified with distributional fairness of agents' utilities. This goal can be achieved in a laboratory setting where all other factors that affect utilities can be excluded and the system can be tested against modeled adversaries. Taking the fairness of agent behavior explicitly into account when building trust or distrust can help realize the goal of fairness in trust management systems. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
33. Handling Model Changes: Regression Testing and Test-Suite Update with Model-Checkers.
- Author
-
Fraser, Gordon, Aichernig, Bernhard K., and Wotawa, Franz
- Subjects
COMPUTER science ,ELECTRONICS ,COMPUTER programming ,COMPUTERS ,COMPUTER systems - Abstract
Abstract: Several model-checker based methods to automated test-case generation have been proposed recently. The performance and applicability largely depends on the complexity of the model in use. For complex models, the costs of creating a full test-suite can be significant. If the model is changed, then in general the test-suite is completely regenerated. However, only a subset of a test-suite might be invalidated by a model change. Creating a full test-suite in such a case would therefore waste time by unnecessarily recreating valid test-cases. This paper investigates methods to reduce the effort of recreating test-suites after a model is changed. This is also related to regression testing, where the number of test-cases necessary after a change should be minimized. This paper presents and evaluates methods to identify obsolete test-cases, and to extend any given test-case generation approach based on model-checkers in order to create test-cases for test-suite update or regression testing. [Copyright Elsevier]
- Published
- 2007
- Full Text
- View/download PDF
34. Time-awareness and Proactivity in Models of Interactive Computation.
- Author
-
Motus, Leo, Meriste, Merik, and Dosch, Walter
- Subjects
ELECTRONIC systems ,COMPUTER systems ,COMPUTER science ,CYBERNETICS - Abstract
Abstract: The paper discusses explicit properties, and the requirements to be verified, imposed upon software-intensive systems by their environment and their users. Such systems are time-critical, may contain autonomous components, and may exhibit proactive behaviour. It is suggested that the analysis and verification of properties in software-intensive systems requires a time-aware model of interactive computation. The authors claim that the time interpretation used hitherto in computer science is too simplified, and that several simultaneously maintained, independent time-counting systems are a necessary precondition for timing analysis of interactions. A feature space for comparing existing approaches to interactive computing is suggested, and a potential candidate for a time-aware model of interactive computation is discussed. [Copyright Elsevier]
- Published
- 2005
- Full Text
- View/download PDF
35. Formal Specification and Verification of Multi-Agent Systems.
- Author
-
Bourahla, Mustapha and Benmohamed, Mohamed
- Subjects
COMPUTER systems ,INFORMATION theory ,COMPUTER science ,SEMANTICS - Abstract
Abstract: Multi-agent systems are increasingly complex, and the problem of their verification and validation is acquiring increasing importance. In this paper we show how a well known and effective verification technique, model checking, can be generalized to deal with multi-agent systems. This paper explores a particular type of multi-agent system, in which each agent is viewed as having the three mental attitudes of belief (B), desire (D), and intention (I). We use a multi-modal branching-time logic BDICTL, with a semantics that is grounded in traditional decision theory and a possible-worlds framework. A preliminary implementation of the approach shows promising results. [Copyright Elsevier]
- Published
- 2005
- Full Text
- View/download PDF
36. A New Competitive Intelligence-based Strategy for Web Page Search.
- Author
-
Rasekh, Iman
- Subjects
SEARCH engine optimization ,COMPUTER security ,COMPUTER programming ,COMPUTER systems ,COMPUTER science - Abstract
Search Engine Optimization (SEO) is a collection of techniques that allow a site to get more traffic from search engines. Page ranking is the fundamental concept of SEO and is defined as a weighted number representing the relative importance of a page based on its inbound and outbound links. In this paper, I propose a new type of web page search based on competitive intelligence. It uses a link-based evolutionary ranking scheme to accommodate users' preferences. I implemented a prototype system and demonstrate the feasibility of the proposed web page search scheme. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
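Link-based ranking of the kind this entry builds on can be sketched as the standard damped PageRank iteration; the small graph and damping factor below are illustrative, not from the paper:

```python
def pagerank(links, d=0.85, iters=100):
    """links: page -> list of outbound pages. Returns page -> rank, where
    each page gets (1-d)/n baseline plus a d-damped share of the rank of
    the pages linking to it."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p in pages:
            outs = links.get(p, [])
            if outs:
                share = d * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += d * rank[p] / n
        rank = new
    return rank
```

On the three-page graph a ← b ↔ a, c → a, page "a" ends up highest because it receives rank from both other pages, and the ranks always sum to one.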
37. Gaussian Mixture Distribution Analysis as Estimation of Probability Density Function and its Periphery.
- Author
-
Tsukakoshi, Kiyoshi and Ida, Kenichi
- Subjects
GAUSSIAN mixture models ,COMPUTER security ,COMPUTER programming ,COMPUTER systems ,COMPUTER science - Abstract
In statistics, a mixture distribution model is a stochastic model for a measured data set that expresses the existence of subpopulations within a population, without requiring that the subpopulation to which each observation belongs be identified. Formally, a mixture distribution model expresses the probability distribution of observations in the population as a whole, yet it is also related to the problem of drawing a subpopulation's characteristics out of the mixture. The model is used, without subpopulation identity information, to make statistical inferences about the characteristics of subpopulations from which only observational data about the whole population are available. This paper considers these matters through the similarity between the linear combination of element functions and two density-estimation problems: estimation using a kernel function and estimation using a spline function. The choice of bandwidth in kernel density estimation is related to the placement (translation) of knots in spline density estimation and to scale in wavelet analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
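The kernel-density side of that comparison can be made concrete: a Gaussian KDE is itself a mixture with one equally weighted component per observation, and the bandwidth plays the role the abstract relates to knot placement and scale. The data and bandwidth below are illustrative:

```python
import math

def gaussian_kde(data, bandwidth):
    """Kernel density estimate: an equally weighted Gaussian mixture with
    one component centred at each data point and common scale `bandwidth`."""
    n = len(data)

    def density(x):
        return sum(math.exp(-0.5 * ((x - xi) / bandwidth) ** 2)
                   for xi in data) / (n * bandwidth * math.sqrt(2 * math.pi))

    return density
```

Because each component integrates to one and the weights are 1/n, the estimate integrates to one; a symmetric sample yields a symmetric density.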
38. A Web based System for Cricket Talent Identification, Enhancement and Selection (C-TIES).
- Author
-
Ahamad, Gulfam, Naqvi, S. Kazim, Beg, M.M. Sufyan, and Ahmed, Tanvir
- Subjects
CRICKET (Sport) ,COMPUTER security ,COMPUTER programming ,COMPUTER systems ,COMPUTER science - Abstract
Cricket is an extremely popular game. More than a million cricketers play daily in India alone and aspire to become professionals. Cricket talent identification and enhancement is a challenging problem due to a lack of quality coaches, meagre infrastructure, and poor linkages between coaching academies and cricket authorities. In India the problem is even tougher, as the majority of the population resides in villages. Many talented players do not receive timely recognition from cricket boards, resulting in a waste of talent, while many others keep pursuing cricket despite being non-talented. Due to the lack of appropriate scientific methods, the selection process is also criticized by many as biased. In this paper, we present a web based system, Cricket Talent Identification, Enhancement and Selection (C-TIES), for addressing these issues. C-TIES utilizes a cricket talent knowledgebase of experts' opinions aggregated by applying the OWA Aggregation Operator and a Relative Fuzzy Linguistic Quantifier (RFLQ). The C-TIES system classifies an enthusiast's cricket talent level into five classes by applying a Normalized Adequacy Coefficient (NAC). The Talent Enhancement and Talent Selection subsystems also use algorithms based on OWA, RFLQ and NAC to identify weaknesses in a player and to select the most talented n players from a larger group. The system thus reduces the time needed to identify weaknesses and provides a relatively unbiased method for short-listing players. It has been developed using Struts 2.0, Hibernate, J2EE, Ajax, and MySQL. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
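The aggregation machinery named in the abstract can be sketched concisely: a relative fuzzy linguistic quantifier such as "most" induces OWA weights, which are then applied to scores sorted in descending order. The quantifier parameters a = 0.3, b = 0.8 are a common textbook choice for "most", assumed here rather than taken from the paper:

```python
def rflq_weights(n, a=0.3, b=0.8):
    """OWA weights from the linguistic quantifier 'most':
    Q(r) = 0 for r <= a, (r - a)/(b - a) for a < r < b, 1 for r >= b;
    w_i = Q(i/n) - Q((i-1)/n)."""
    def Q(r):
        if r <= a:
            return 0.0
        if r >= b:
            return 1.0
        return (r - a) / (b - a)

    return [Q(i / n) - Q((i - 1) / n) for i in range(1, n + 1)]

def owa(values, weights):
    """Ordered weighted averaging: weights apply to values sorted descending."""
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))
```

The weights telescope to sum to one, so aggregating identical expert scores returns that score unchanged, a standard sanity check for OWA operators.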
39. Developing a Novel Holistic Taxonomy of Security Requirements.
- Author
-
Rjaibi, Neila and Rabai, Latifa Ben Arfa
- Subjects
TAXONOMY ,COMPUTER security ,COMPUTER programming ,COMPUTER systems ,COMPUTER science - Abstract
Defining security requirements is of prime importance for all systems; we usually study generic ones such as confidentiality, integrity, availability, authentication, non-repudiation and privacy. It is thus imperative to evaluate all possible extended requirements. A literature review has shown that there are various, differing security requirements models, some of which are well examined while others are neglected; moreover, security lacks a unified taxonomy of requirements. In this paper, we draw on the variety of security requirements models in the literature to derive an aggregate model and move away from individualistic taxonomies toward a hierarchical, standard security requirement model. We define and propose a novel, holistic security requirement taxonomy at two levels of abstraction that incorporates 13 basic, standard requirements, refined in layers into 31 security requirement sub-factors. These requirements are discussed in the open literature. Our taxonomy offers a good understanding of the security constraints relevant to system functions in the field of computer security. When it comes to decision making, it is recommended to establish which security measures are relevant to the entire security requirements taxonomy. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
40. A Distinctive Genetic Approach for Test-Suite Optimization.
- Author
-
Jeyaprakash, Srividhya and Alagarsamy, K.
- Subjects
COMPUTER software testing ,COMPUTER programming ,COMPUTER systems ,COMPUTER science ,COMPUTER security - Abstract
Software testing is one of the most expensive activities in the software development lifecycle, but it is used to assure the quality of the software. An important focus of the software engineering process is to lessen the effort of software testing and cut down its cost and time. This paper presents a unified method for reducing test cases by adopting Genetic Algorithm operators such as crossover, mutation and crowding distance, resulting in a minimal set of test cases with the ability to cover all the faults in minimum time. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
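A minimal sketch of the idea follows, assuming an illustrative fault-coverage matrix; the fitness design, population size and operators here are simplifications for exposition, not the paper's exact algorithm (crowding distance, in particular, is omitted):

```python
import random

def minimise_suite(coverage, generations=200, pop_size=30, seed=1):
    """GA sketch for test-suite reduction. coverage[i] is the set of faults
    detected by test case i; a chromosome is a 0/1 mask over test cases.
    Fitness favours full fault coverage first, then fewer test cases."""
    rng = random.Random(seed)
    n = len(coverage)

    def fitness(mask):
        covered = set().union(*[coverage[i] for i in range(n) if mask[i]])
        return (len(covered), -sum(mask))

    # seed the population with the full suite so full coverage is attainable
    pop = [[1] * n] + [[rng.randint(0, 1) for _ in range(n)]
                       for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]           # elitist survivor selection
        children = []
        while len(elite) + len(children) < pop_size:
            p1, p2 = rng.sample(elite, 2)
            cut = rng.randrange(1, n)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child[rng.randrange(n)] ^= 1       # single-bit mutation
            children.append(child)
        pop = elite + children
    best = max(pop, key=fitness)
    return [i for i in range(n) if best[i]]
```

Elitism guarantees the best-so-far chromosome is never lost, so the returned subset always covers every fault the full suite covers while the mutation pressure trims redundant test cases.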
41. ANN Based Robotic Arm Visual Servoing Nonlinear System.
- Author
-
Al-Junaid, Hessa
- Subjects
NONLINEAR systems ,COMPUTER systems ,COMPUTER software ,COMPUTER science ,COMPUTER security - Abstract
The paper presents an ANN-based mechanism to model inter-related visual kinematics relations, which are an important fragment of visual servo closed loop system. The primary goal is to visually servo a 6-DOF robot arm, from an initial location toward a desired posture using only image data from a scene provided during the arm maneuver. Arm movements, dynamics, and kinematics were simulated with Matlab Robotics Tool Box. The arm is equipped with a CCD camera, where the visual CCD part is modeled and simulated with Matlab Epipolar Geometry Toolbox. The main issue that hinders visual servo system is related to the variant feature Jacobian matrix. The methodology used is based on the concept of integrating a multilayer ANN with an Image Based Visual Servoing system. ANN was used to learn, approximate, and map the highly complex relations that relate a scene movements to an arm movement through a visual servo. The proposed technique has proven to be an effective approach, that has resulted in reducing computational time. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
42. Predicting Number of Faults in Software System using Genetic Programming.
- Author
-
Rathore, Santosh S. and Kumar, Sandeep
- Subjects
GENETIC programming ,COMPUTER systems ,COMPUTER software ,COMPUTER science ,COMPUTER security - Abstract
From the software development perspective, dealing with software faults is a vital task. The presence of faults not only reduces the quality of the software but also increases its development cost. A large number of models have been presented in the past to predict the fault proneness of software systems. However, most of them provide inadequate information and thus make the task of fault prediction difficult. In this paper, we present an approach to predict the number of faults in a given software system using Genetic Programming (GP). We validate the proposed approach in an experimental investigation using the fault datasets of ten software projects available in the PROMISE data repository. The error rate, recall and completeness of the fault prediction model are used to evaluate the performance of the proposed approach. The results show that GP-based models produce significant results for predicting the number of faults. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
43. Jitter Buffer Modelling and Analysis for TDM over PSN.
- Author
-
Seshasayee, Usha Rani and Rathinam, Manivasakan
- Subjects
TIME division multiple access ,COMPUTER systems ,COMPUTER software ,COMPUTER science ,COMPUTER security - Abstract
Time Division Multiplexing (TDM) over Packet Switched Network (PSN) is a pseudo-wire technology for emulating TDM circuits over packet networks. Conceptually, the important ingredients of this technology are to implement, in the PSN: (i) Quality of Service (QoS), implemented through scheduling at the intermediate nodes that gives priority to packets containing 'TDM' payload; (ii) timing and synchronization; and (iii) scheduling in the jitter buffer for minimum output variance. Among these, this paper addresses (iii) as a first step, assuming that the PSN provisions an "unacknowledged virtual circuit" (the main components of a 'virtual circuit' being QoS and connection-oriented service). This work targets a scheduling algorithm (service intervals) in the jitter buffer at the receiver such that the variance of the inter-departure intervals of the TDM stream is minimized. This is accomplished by modelling the buffer as an M/G/1 queueing system with Auto-Regressive AR(1) correlations within the service intervals. The motivation for this correlation structure is two-fold. First, such correlation within the service intervals results in a reduction of variance in the inter-departure intervals. Second, the analysis of such a correlated queue is analytically tractable. The variance of the inter-departure time is presented. The analysis of the departure process and the waiting times of incoming packets of this correlated queue aids in determining correlation parameters that are sub-optimal in the context of TDMoPSN. Our study also includes an M/G/1 queue with AR(1) cross-correlations between the inter-arrival and service times, and a G/G/1 queue in which the inter-arrivals are correlated, with AR(1) correlations of the above two types. Extensive simulations corroborate our analytical and approximation results. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
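The setting can be explored with a small simulation. This sketch only mimics the model's ingredients (a single-server FIFO queue with AR(1)-correlated service times); it does not reproduce the paper's analysis, and every parameter value below is an illustrative assumption:

```python
import random

def departure_stats(n, lam, mean_s, phi, seed=7):
    """Single-server FIFO queue: Poisson arrivals at rate lam, service times
    following an AR(1) recursion around mean_s with coefficient phi.
    Returns (mean, variance) of the inter-departure intervals."""
    rng = random.Random(seed)
    t_arr, free_at, s_prev = 0.0, 0.0, mean_s
    departures = []
    for _ in range(n):
        t_arr += rng.expovariate(lam)
        # AR(1) service process: s_k = mean + phi*(s_{k-1} - mean) + noise
        s = mean_s + phi * (s_prev - mean_s) + rng.gauss(0.0, 0.1 * mean_s)
        s = max(s, 1e-9)  # keep service times positive
        free_at = max(t_arr, free_at) + s
        s_prev = s
        departures.append(free_at)
    gaps = [b - a for a, b in zip(departures, departures[1:])]
    m = sum(gaps) / len(gaps)
    var = sum((g - m) ** 2 for g in gaps) / len(gaps)
    return m, var
```

In a stable regime (lam · mean_s < 1) the mean inter-departure interval tracks the arrival rate, roughly 1/lam, while the variance is the quantity the paper's scheduling seeks to minimise by tuning the correlation parameter.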
44. Mobile Platform Detect and Alerts System for Driver Fatigue.
- Author
-
Abulkhair, Maysoon, Alsahli, Arwa H., Taleb, Kawther M., Bahran, Atheer M., Alzahrani, Fatimah M., Alzahrani, Hend A., and Ibrahim, Lamiaa Fattouh
- Subjects
MOBILE operating systems ,COMPUTER systems ,COMPUTER science ,COMPUTER programming ,COMPUTER security - Abstract
The transition state between being awake and asleep is called drowsiness. When a driver is in this state, he can cause accidents because his reaction time is slower, his attentiveness is reduced, and his information processing is less efficient. A Driver Fatigue Detection System (FDS) was proposed by the authors in recent work. The FDS monitors the driver's alertness to prevent them from falling asleep at the wheel. In the present paper, the FDS software is modified to run on a smartphone instead of a laptop, which is very hard to fix in a car, and to use all the advantages of a smartphone, such as its camera and light weight. The proposed system solves this problem by using a mobile phone camera; the phone is put on a stand in the car so the driver feels comfortable. The proposed system has hardware and software components, such as the mobile camera and the Android SDK. Both components are integrated to record real video of the driver and then process it for real-time eye tracking. This system retains all the advantages of the FDS, such as fast, real-time face and eye tracking, limited interference from external illumination, and robustness and accuracy allowing for fast head/face movement. The main goals of this system are to ensure that the driver stays awake while driving, to make the driver feel comfortable, and to help decrease the number of accidents. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
45. Evolution of Modern Management through Taylorism: An Adjustment of Scientific Management Comprising Behavioral Science.
- Author
-
Uddin, Nasir and Hossain, Fariha
- Subjects
TAYLORISM (Management) ,COMPUTER systems ,COMPUTER software ,COMPUTER science ,COMPUTER security - Abstract
From the traditional approach to the scientific approach, and then from Scientific Management to the modern phase, management methodology, principles and approaches have reached their current stage. Taylor, the originator of scientific management, brought a revolution in the twentieth century by introducing scientific aspects of formulating patterns and disciplines within project management. Scientific management emphasizes profit maximization by utilizing workers through controlled mechanisms, training, and monetary incentives under managers; however, it has been heavily scrutinized and criticized for its short-term focus on profit and for treating workers as machine-like forms, which is argued to result in negative performance in the long run. Therefore, a drift towards behavioral study emerged, and social factors were included to address the challenges that Taylor's method neglected. Through an extensive literature review, this paper shows that the advancement of technology and globalization stimulated the modern management approach to adjust and complement scientific management by supplementing the human factor and workers' contributions within an organization rather than substituting the traditional approach. Therefore, together with productive activities and the completion of defined tasks, a successful modern-day project management model highly values employee contributions and feedback at all levels. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
46. Arabic Ontology Model for Financial Accounting.
- Author
-
Hegazy, A., Sakre, M., and Khater, Eman
- Subjects
ACCOUNTING ,COMPUTER systems ,COMPUTER software ,COMPUTER science ,COMPUTER security - Abstract
Ontologies have become core components of many large applications and are part of the Semantic Web framework. Many disciplines now develop standardized ontologies that domain experts can use to share and annotate information in their fields. Arabic, however, is still not well supported. Designing and developing Arabic ontologies requires more than keyword-based approaches and more than analysing the morphology and grammar of the traditional Arabic language. The few works on Arabic Semantic Web applications are derived from the traditional Arabic language rather than from specialized sciences. Moreover, available search engines supporting the Arabic language are typically limited to keyword searches and do not take into consideration the underlying semantics of the content. In this research, we propose a model for representing Arabic financial accounting knowledge in the computer technology domain using ontologies. This paper addresses an Arabic financial ontology and presents a methodology for creating ontologies based on declarative knowledge representation systems. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
47. Integration of Object Oriented Host Program with Network DBMS.
- Author
-
Handigund, Shivanand M., Makanur, Siddappa G., and Rao, M. Sreenivasa
- Subjects
DATABASES ,COMPUTER systems ,COMPUTER software ,COMPUTER science ,COMPUTER security - Abstract
Several mapping techniques are in use for the storage of objects in a Network Database Management System (NDBMS). Though there is a generation gap between the evolution of the NDBMS and Object Oriented Technology (OOT), the two are either analogous or complementary to each other. Therefore, a mapping technique to map the class diagram onto the Bachman diagram has evolved. A host program accessing the database and one accessing an independent data file may differ in the number and use of attributes and classes. Hence, along with the mapping techniques, the implementation subsets of the structural and behavioral aspects are to be considered. Moreover, the persistent closure (the set of connected dependent objects) is to be maintained during the storage and retrieval of objects. Thus, a mere mapping technique is not sufficient for the storage and retrieval of objects, as the host program has to establish the relevancy of the database with respect to its authorized subset. In this paper, we attempt to develop guidelines to assist the programmer in determining the closure of every mapped class and, accordingly, in designing the persistent constructor and loader for the storage and retrieval of objects, respectively. The persistent closure is determined by identifying the type of interrelationship (degree of cohesion) between objects (classes). For an illustrative portion of the class diagram depicting the business process and its implementation in the NDBMS, a table containing the implemented classes with their connected classes is prepared, and then guidelines are proposed to integrate an object oriented host program to access the database through the NDBMS. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
48. A Review of Evaluation Methods and Techniques for Simulation Packages.
- Author
-
Alomair, Yasmeen, Ahmad, Iftikhar, and Alghamdi, Abdullah
- Subjects
COMPUTER simulation ,COMPUTER systems ,COMPUTER science ,COMPUTER programming ,COMPUTER security - Abstract
Given the varied benefits of using simulation packages, such as cost efficiency, time savings, risk mitigation, greater insight and user friendliness, simulation has been used extensively in many application domains such as defense, airports, manufacturing, engineering, and healthcare. Currently, due to the increasing number of simulation software packages available in the market, it is often challenging to choose a suitable package that meets users' demands. Researchers have leveraged many assessment methods and techniques to facilitate the evaluation and selection of simulation packages. This paper provides a survey corroborated by researchers' contributions to simplify the selection process for simulation packages, and discerns the evaluation methods and techniques for simulation packages found in the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
49. A Programming Environment for Visual Block-Based Domain-Specific Languages.
- Author
-
Kurihara, Azusa, Sasaki, Akira, Wakita, Ken, and Hosobe, Hiroshi
- Subjects
PROGRAMMING languages ,COMPUTER systems ,COMPUTER science ,COMPUTER programming ,COMPUTER security - Abstract
Visual block-based programming is useful for various users, such as novice programmers, because it provides easy operations and improves the readability of programs. Also, in programming education, it is known to be effective to initially present basic language features and then gradually make more advanced features available. However, the cost of implementing such visual block-based languages remains a challenge. In this paper, we present a programming environment for providing visual block-based domain-specific languages (visual DSLs) that are translatable into various programming languages. In our environment, programs are built by combining visual blocks expressed in a natural language. Blocks represent program elements such as operations and variables, Tips represent snippets, and macro blocks represent procedures. Using Tips and macros makes code more abstract and reduces the number of blocks in it. Visual DSLs can serve as a front end for various languages and can easily be restricted and extended by adding and deleting blocks. We applied our programming environment to Processing, an educational programming language for media art, and show that the environment is useful for novice programmers who are learning basic concepts of programming and the features of Processing. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
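The core of such an environment, translating a tree of blocks into a target language, can be sketched as a recursive walk over per-language templates. The block names, templates, and target snippet below are illustrative assumptions, not the authors' actual environment:

```python
# Each block is (name, children); templates map block names to code in the
# target language (here, a hypothetical Processing-like target).
TEMPLATES = {
    "repeat": "for (int i = 0; i < {0}; i++) {{ {1} }}",
    "move":   "translate({0}, 0);",
    "number": "{0}",
}

def emit(block, templates):
    # Recursively render children, then fill this block's template with them.
    name, children = block
    args = [emit(c, templates) if isinstance(c, tuple) else str(c)
            for c in children]
    return templates[name].format(*args)

# "repeat 10 times: move 5" as a block tree
program = ("repeat", [("number", [10]), ("move", [("number", [5])])])
```

Restricting or extending the visual DSL then amounts to removing or adding entries in the template table, and retargeting it to another language means swapping in a different table.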
50. OWA based Book Recommendation Technique.
- Author
-
Sohail, Shahab Saquib, Siddiqui, Jamshed, and Ali, Rashid
- Subjects
INTERNET ,COMPUTER systems ,COMPUTER science ,COMPUTER programming ,COMPUTER security - Abstract
The proliferation of modern technologies has caused data overload on the Internet. The huge amount of data on the World Wide Web has made it harder for users to extract exactly the information they need. Various recommendation techniques are used to help customers purchase the desired items when shopping online. In this paper, we propose a recommendation method for books. We use a positional aggregation based scoring (PAS) technique to score the books recommended by top-ranked universities, and we assign weights to these scores using fuzzy quantifiers. We then apply the Ordered Weighted Averaging (OWA) aggregation operator over these scores to find the top books. Finally, the top-ranked books are recommended. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
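The OWA operator named in the abstract is standard: weights are derived from a fuzzy quantifier and applied to the scores in descending order. A minimal sketch, where the "most" quantifier parameters and the sample book scores are assumptions, not the paper's data:

```python
def quantifier_most(r, a=0.3, b=0.8):
    # RIM fuzzy quantifier "most": 0 up to a, 1 from b, linear in between
    if r <= a:
        return 0.0
    if r >= b:
        return 1.0
    return (r - a) / (b - a)

def owa(scores, Q=quantifier_most):
    # OWA: w_i = Q(i/n) - Q((i-1)/n), applied to the i-th largest score
    n = len(scores)
    weights = [Q(i / n) - Q((i - 1) / n) for i in range(1, n + 1)]
    ordered = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(weights, ordered))

# Hypothetical per-university scores for two books (not from the paper)
books = {"Book A": [0.9, 0.8, 0.7], "Book B": [1.0, 0.4, 0.3]}
ranked = sorted(books, key=lambda b: owa(books[b]), reverse=True)
```

With the "most" quantifier, a book scored consistently well across universities (Book A) outranks one with a single high score (Book B), which is the behavior fuzzy-quantifier weighting is chosen for.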