69 results for "Yamany A"
Search Results
2. Network Slicing and Management
- Author
-
Luis M. Contreras and Sameh Yamany
- Subjects
Computer science, Distributed computing, Slicing
- Published
- 2021
3. Exploiting resiliency for Kernel-wise CNN approximation enabled by adaptive hardware Design
- Author
-
Andre Guntoro, Cecilia De la Parra, Taha Soliman, Akash Kumar, Ahmed El-Yamany, and Norbert Wehn
- Subjects
Quantization (physics), Computer engineering, Contextual image classification, Computer science, Design space exploration, Kernel (statistics), Computation, Residual, Convolutional neural network, Communication channel
- Abstract
Efficient low-power accelerators for Convolutional Neural Networks (CNNs) benefit greatly from quantization and approximation, which are typically applied layer-wise for efficient hardware implementation. In this work, we present a novel strategy for efficiently combining these concepts at a deeper level, namely per channel or kernel. We first apply layer-wise, low bit-width, linear quantization and truncation-based approximate multipliers to the CNN computation. Then, based on a state-of-the-art resiliency analysis, we apply a kernel-wise approximation and quantization scheme with negligible accuracy loss, without further retraining. Our proposed strategy is implemented in a specialized framework for fast design space exploration. This optimization boosts estimated power savings by up to 34% in residual CNN architectures for image classification, compared to the base quantized architecture.
- Published
- 2021
- Full Text
- View/download PDF
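The kernel-wise scheme in the abstract above can be illustrated with a minimal sketch: assign a lower bit-width to "resilient" kernels and a higher one to sensitive kernels, then quantize each kernel linearly. The resiliency score used here (kernel L2 norm) and the bit-widths are illustrative assumptions; the paper derives resiliency from an accuracy analysis.

```python
import numpy as np

def quantize(w, bits):
    """Uniform linear quantization of a weight tensor to the given bit-width."""
    scale = np.max(np.abs(w)) / (2 ** (bits - 1) - 1)
    if scale == 0:
        return w
    return np.round(w / scale) * scale

def kernelwise_quantize(conv_weights, resilient_bits=4, sensitive_bits=8, frac=0.5):
    """Quantize each output-channel kernel with its own bit-width.

    conv_weights: (out_channels, in_channels, kH, kW).
    Stand-in resiliency proxy: kernels with small L2 norm are treated as
    resilient and get the lower bit-width.
    """
    out = np.empty_like(conv_weights)
    norms = np.linalg.norm(conv_weights.reshape(conv_weights.shape[0], -1), axis=1)
    cutoff = np.quantile(norms, frac)
    for k in range(conv_weights.shape[0]):
        bits = resilient_bits if norms[k] <= cutoff else sensitive_bits
        out[k] = quantize(conv_weights[k], bits)
    return out
```

The per-kernel quantization error is bounded by half the kernel's quantization step, so resilient kernels trade a slightly larger error for cheaper arithmetic.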
4. Correction to: A Tri-level Programming Framework for Modelling Attacks and Defences in Cyber-Physical Systems
- Author
-
Nour Moustafa, Waleed Yamany, and Benjamin Turnbull
- Subjects
Software framework, Computer science, Cyber-physical system, Software engineering
- Published
- 2020
5. Characterizing the Performance of Interstate Flexible Pavements Using Artificial Neural Networks and Random Parameters Regression
- Author
-
Matthew Volovski, Anwaar Ahmed, Mohamed S. Yamany, and Tariq Usman Saeed
- Subjects
Artificial neural network, Computer science, Building & construction, Civil engineering, Agency (sociology), Econometrics, Variation (game tree), Random parameters, Regression, Civil and Structural Engineering
- Abstract
Past studies developed pavement performance models using data from all or multiple states across the United States. This study hypothesized that due to variation in agency practices and wor...
- Published
- 2020
6. A Tri-level Programming Framework for Modelling Attacks and Defences in Cyber-Physical Systems
- Author
-
Nour Moustafa, Waleed Yamany, and Benjamin Turnbull
- Subjects
Exploit, Computer science, Evolutionary algorithm, Cyber-physical system, Computer security, Software framework, Electric power system, Genetic algorithm, Benchmark (computing), Resource allocation
- Abstract
Smart power grids suffer from coordinated attacks that exploit both their physical and cyber layers. Studying power systems under such complex hacking scenarios is essential to discovering system vulnerabilities and to protecting against them and their attack vectors with appropriate defensive actions. This paper proposes an efficient tri-level programming framework that dynamically determines attack scenarios, along with the best defensive actions, in cyber-physical systems. The tri-level optimisation framework models the optimal decisions of defence strategies, malicious vectors, and their operators, optimising the unmet demand while attacks are launched. Defensive resource allocation is designed using an evolutionary algorithm to examine coordinated attacks that exploit power systems. The framework includes a Genetic Algorithm (GA) to solve each of the model's levels and can flexibly model malicious vectors and their defences. The IEEE 14-bus benchmark is employed to evaluate the proposed framework.
- Published
- 2020
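The nested defend/attack structure described in the abstract above can be sketched with a toy GA: the defender's hardening choice is evolved, while the lower-level attacker is solved greedily inside the fitness function. All numbers (line loads, budgets) are hypothetical and not from the paper, which uses a full tri-level power-system model.

```python
import random

# Toy setting (hypothetical numbers): the defender hardens BUDGET lines,
# then the attacker trips the ATTACKS highest-load undefended lines;
# fitness is the resulting unmet demand.
N_LINES = 6
LINE_LOAD = [10, 8, 6, 5, 4, 2]
BUDGET = 3
ATTACKS = 2

def unmet_demand(defended):
    undefended = [LINE_LOAD[i] for i in range(N_LINES) if i not in defended]
    return sum(sorted(undefended, reverse=True)[:ATTACKS])

def ga_defend(pop_size=30, gens=60, seed=1):
    """GA over defender allocations; the attacker response is solved greedily
    inside the fitness, mimicking the nested levels of the tri-level model."""
    rng = random.Random(seed)
    pop = [rng.sample(range(N_LINES), BUDGET) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda d: unmet_demand(set(d)))
        elite = pop[: pop_size // 2]            # keep the better half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            child = rng.sample(sorted(set(a) | set(b)), BUDGET)  # crossover
            if rng.random() < 0.2:               # mutation: swap in a random line
                child[rng.randrange(BUDGET)] = rng.randrange(N_LINES)
            child = list(dict.fromkeys(child))
            while len(child) < BUDGET:           # repair duplicate lines
                extra = rng.randrange(N_LINES)
                if extra not in child:
                    child.append(extra)
            children.append(child)
        pop = elite + children
    best = min(pop, key=lambda d: unmet_demand(set(d)))
    return sorted(best), unmet_demand(set(best))
```

In this toy instance the unique optimum is to harden the three highest-load lines, leaving the attacker an unmet demand of 9.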
7. FACTORS AFFECTING CONSTRUCTION PLANNING
- Author
-
Hossam Eldin Hosni, Eman Fathi El-Din Ahmed, and Ahmed El Yamany
- Subjects
Project planning, Process management, Computer science, Delphi method, Construction planning
- Abstract
Construction planning (CP) is one of the most important phases of construction projects. This research aims to identify the most important factors affecting CP. These factors were identified in two steps: first, collecting factors affecting CP through a comprehensive literature review; second, identifying the most important planning factors using the Delphi technique. This research helps practising engineers create successful plans by focusing on the specific factors that affect project planning.
- Published
- 2018
8. Robust Defect Pixel Detection and Correction for Bayer Imaging Systems
- Author
-
Noha A. El-Yamany
- Subjects
Pixel, Computer science, Computer vision, Artificial intelligence
- Published
- 2017
9. Hybrid Approach to Incorporate Preventive Maintenance Effectiveness into Probabilistic Pavement Performance Models
- Author
-
Dulcy M. Abraham and Mohamed S. Yamany
- Subjects
Markov chain, Computer science, Probabilistic logic, Pavement maintenance, Markov process, Transportation, Hybrid approach, Preventive maintenance, Civil and Structural Engineering, Reliability engineering
- Abstract
Various methodologies are being developed to build and improve probabilistic pavement performance models that have high prediction capabilities. However, the effectiveness of preventive mai...
- Published
- 2021
10. An Innovative Approach for Attribute Reduction Using Rough Sets and Flower Pollination Optimisation
- Author
-
Gerald Schaefer, Eid Emary, Waleed Yamany, Shao Ying Zhu, and Aboul Ella Hassanien
- Subjects
Computer science, Computational intelligence, Machine learning, Reduction (complexity), Search algorithm, Pattern recognition, Fitness function, Dominance-based rough set approach, Maxima and minima, Attribute reduction, Benchmark (computing), Flower pollination optimisation, Rough set, Artificial intelligence, Data mining
- Abstract
Optimal search is a major challenge for wrapper-based attribute reduction. Rough sets have been used with much success, but current hill-climbing rough set approaches to attribute reduction are insufficient for finding optimal solutions. In this paper, we propose an innovative use of an intelligent optimisation method, namely the flower search algorithm (FSA), with rough sets for attribute reduction. FSA is a relatively recent computational intelligence algorithm inspired by the pollination process of flowers. For many applications, the attribute space, besides being very large, is also rough with many different local minima, which makes it difficult to converge towards an optimal solution. FSA can adaptively search the attribute space for optimal attribute combinations that maximise a given fitness function, with the fitness function used in our work being rough set-based classification. Experimental results on various benchmark datasets from the UCI repository confirm that our technique performs well in comparison with competing methods.
- Published
- 2016
- Full Text
- View/download PDF
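The rough-set fitness described in the abstract above can be made concrete with the standard dependency degree (the fraction of objects whose equivalence class under the selected attributes is decision-consistent), searched by a simplified binary flower-pollination loop. The binary move rules and the size penalty below are illustrative assumptions, not the paper's exact operators.

```python
import random

def dependency_degree(data, labels, subset):
    """Rough-set dependency: fraction of objects whose equivalence class
    (under the selected attributes) has a unique decision label."""
    if not subset:
        return 0.0
    classes = {}
    for row, y in zip(data, labels):
        key = tuple(row[i] for i in subset)
        classes.setdefault(key, set()).add(y)
    consistent = sum(
        1 for row, y in zip(data, labels)
        if len(classes[tuple(row[i] for i in subset)]) == 1
    )
    return consistent / len(data)

def fpa_reduct(data, labels, n_attrs, iters=100, pop=10, p_global=0.8, seed=0):
    """Binary flower-pollination search for an attribute reduct (sketch)."""
    rng = random.Random(seed)
    def fitness(mask):
        subset = [i for i in range(n_attrs) if mask[i]]
        # reward dependency, lightly penalize subset size (reduct search)
        return dependency_degree(data, labels, subset) - 0.01 * len(subset)
    swarm = [[rng.randint(0, 1) for _ in range(n_attrs)] for _ in range(pop)]
    best = max(swarm, key=fitness)[:]
    for _ in range(iters):
        for k, sol in enumerate(swarm):
            if rng.random() < p_global:   # "global pollination": mix with best
                cand = [b if rng.random() < 0.5 else s for b, s in zip(best, sol)]
            else:                          # "local pollination": flip one bit
                cand = sol[:]
                cand[rng.randrange(n_attrs)] ^= 1
            if fitness(cand) > fitness(sol):
                swarm[k] = cand
        best = max(swarm + [best], key=fitness)
    return [i for i in range(n_attrs) if best[i]]
```

On a toy table where attribute 0 fully determines the label, the search keeps attribute 0 and reaches full dependency.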
11. A Generic Approach CNN-Based Camera Identification for Manipulated Images
- Author
-
Ahmed El-Yamany, Hossam Fouad, Masoud Alghoniemy, and Youssef Raffat
- Subjects
Demosaicing, Computer science, System identification, Convolutional neural network, Multiplexer, Identification (information), Robustness (computer science), Feature (computer vision), Computer vision, Artificial intelligence, Applications of artificial intelligence
- Abstract
Camera model identification has been attracting considerable attention lately as a powerful forensic method. With promising breakthroughs in artificial intelligence applications, such systems have been revisited to increase accuracy or to resolve still-persisting deadlocks. One unsolved dilemma is the effect of image manipulations on the overall accuracy of identification systems: a huge degradation in performance is observed when images are post-processed with commonly used methods such as compression, scaling, and contrast enhancement. We use the state-of-the-art Convolutional Neural Network (CNN) architecture proposed by Bayar et al. to estimate the manipulation parameters, and dedicated feature-extractor models to estimate the source camera. Multiplexers shift the input image between the dedicated models according to the output of the CNNs. Our proposed methods significantly outperform state-of-the-art methods in the literature, especially under heavy compression and downsampling. The test images were taken from 10 different cameras, including different models from the same manufacturer, and different devices were used to investigate the methodology's robustness. Moreover, such a generic approach could fundamentally change the design methodology for camera model identification systems.
- Published
- 2018
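The multiplexer idea in the abstract above reduces to two-stage routing: a manipulation classifier selects which dedicated camera-identification model to run. The stub classifiers, branch names, and camera labels below are hypothetical placeholders; the paper uses CNNs for both stages.

```python
# Sketch of the two-stage routing: the manipulation estimate selects the
# dedicated source-camera model (the "multiplexer"). All names are stand-ins.

def manipulation_classifier(image):
    # stub: pretend we detected the post-processing applied to the image
    return image["manipulation"]          # e.g. "jpeg", "resize", "none"

DEDICATED_MODELS = {
    "none":   lambda img: "camera_A",
    "jpeg":   lambda img: "camera_A",     # model trained on JPEG-compressed data
    "resize": lambda img: "camera_B",     # model trained on rescaled data
}

def identify_camera(image):
    """Multiplexer: route the image to the model dedicated to its manipulation."""
    branch = manipulation_classifier(image)
    model = DEDICATED_MODELS.get(branch, DEDICATED_MODELS["none"])
    return model(image)
```

The design choice is that each dedicated model only ever sees inputs matching its training-time manipulation, which is what restores accuracy under heavy post-processing.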
12. A Generic Approach CNN-Based Camera Identification for Manipulated Images
- Author
-
Youssef Raffat, Ahmed El-Yamany, and Hossam Fouad
- Subjects
Computer science, Feature extraction, System identification, Convolutional neural network, Identification (information), Digital image, Robustness (computer science), Feature (computer vision), Computer vision, Artificial intelligence, Applications of artificial intelligence
- Abstract
Camera model identification has been attracting considerable attention lately as a powerful forensic method. With promising breakthroughs in artificial intelligence applications, such systems have been revisited to increase accuracy or to resolve still-persisting deadlocks. One unsolved dilemma is the effect of image manipulations on the overall accuracy of identification systems: a huge degradation in performance is observed when images are post-processed with commonly used methods such as compression, scaling, and contrast enhancement. We use the state-of-the-art Convolutional Neural Network (CNN) architecture proposed by Bayar et al. to estimate the manipulation parameters, and dedicated feature-extractor models to estimate the source camera. Multiplexers shift the input image between the dedicated models according to the output of the CNNs. Our proposed methods significantly outperform state-of-the-art methods in the literature, especially under heavy compression and downsampling. The test images were taken from 10 different cameras, including different models from the same manufacturer, and different devices were used to investigate the methodology's robustness. Moreover, such a generic approach could fundamentally change the design methodology for camera model identification systems.
- Published
- 2018
13. A Novel Adaptive Golay Correlator Synchronizer for IEEE 802.11ad Indoor mmWave Systems
- Author
-
Markus Petri and Ahmed El-Yamany
- Subjects
IEEE 802, Signal-to-noise ratio, Binary Golay code, Synchronizer, Adaptive algorithm, Orthogonal frequency-division multiplexing, Computer science, Electronic engineering, Synchronization
- Abstract
In this paper, a novel adaptive Golay correlator synchronizer for the IEEE 802.11ad WLAN system is proposed. The proposed design is based upon the well-known Golay Correlator (GC) synchronizer, whose performance is degraded in the low Signal-to-Noise Ratio (SNR) regime due to a fixed threshold. In our proposed scheme, an adaptive algorithm optimizes the threshold of the GC so that it becomes SNR-independent. Our simulation results show that the proposed scheme greatly outperforms traditional GC-based synchronizers in different indoor environments, indicating a substantial change in synchronizer design that is not limited to IEEE 802.11ad.
- Published
- 2018
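The adaptive-threshold idea in the abstract above can be sketched as a matched filter whose detection threshold tracks the measured noise floor rather than a fixed constant. The recursive Golay construction is standard; the median-based threshold and the factor `k` are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def golay_pair(n):
    """Generate a complementary Golay pair of length 2**n (recursive construction)."""
    a, b = np.array([1.0]), np.array([1.0])
    for _ in range(n):
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

def adaptive_sync(rx, seq, k=4.0):
    """Detect the preamble start via a Golay matched filter.

    Instead of a fixed threshold, the threshold adapts to the noise floor
    (estimated from the median correlation magnitude), so detection works
    across SNR regimes. k is a tuning assumption.
    """
    corr = np.abs(np.correlate(rx, seq, mode="valid"))
    threshold = k * np.median(corr)          # noise-floor-relative threshold
    peaks = np.nonzero(corr > threshold)[0]
    return int(peaks[np.argmax(corr[peaks])]) if peaks.size else None
```

With a length-32 Golay sequence buried in noise, the detector recovers the embedding offset.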
14. An Adaptive IEEE 802.11ad Indoor mmWave Inner-Receiver Architecture
- Author
-
Ahmed El-Yamany and Markus Petri
- Subjects
Signal-to-noise ratio, Synchronizer, Binary Golay code, Adaptive algorithm, Computer science, Orthogonal frequency-division multiplexing, Electronic engineering, Signal, Throughput, Communication channel
- Abstract
In this paper, an adaptive inner-receiver architecture for the IEEE 802.11ad WLAN system is proposed. The proposed design is a modified version of the well-known Golay Correlator (GC) based inner receiver, whose synchronizer and channel-estimator performance is degraded in the low Signal-to-Noise Ratio (SNR) regime due to a fixed threshold. Furthermore, higher throughput may be achieved as a result of an early SNR-range indication. In our proposed scheme, an adaptive algorithm optimizes the threshold of the GC-based blocks so that they become SNR-independent, and it gives an early indication of the SNR of the received signal. The signal path is adapted accordingly in several blocks, achieving higher accuracy at a higher throughput rate. Our simulation results show that the proposed scheme outperforms traditional GC-based inner receivers in different indoor environments.
- Published
- 2018
15. Multi-Objective Gray-Wolf Optimization for Attribute Reduction
- Author
-
Eid Emary, Aboul Ella Hassanien, Vaclav Snasel, and Waleed Yamany
- Subjects
Computer science, Particle swarm optimization, Swarm behaviour, Pattern recognition, Feature selection, Mutual information, Multi-objective optimization, Attribute reduction, Robustness (computer science), Genetic algorithm, Artificial intelligence, Data mining, Grey wolf optimization, Classifier (UML)
- Abstract
Feature sets are often dependent, redundant, and noisy in almost all application domains. These problems degrade the performance of any given classifier, as they make it difficult for the training phase to converge effectively and they also affect the running time of classification at both operation and training time. In this work, a system for feature selection based on multi-objective grey wolf optimization is proposed. Existing methods for feature selection either depend on the data description (filter-based methods) or on the classifier used (wrapper approaches); neither combines good performance and data description in the same system. Here, grey wolf optimization, a swarm-based optimization method, is employed to search the feature space for an optimal feature subset that achieves data description with minor redundancy while preserving classification performance. At the early stages of optimization, the grey wolf optimizer uses filter-based principles to find a set of solutions with minor redundancy, measured by mutual information. At later stages, a wrapper approach guided by classifier performance is employed to further steer the obtained solutions towards better classification performance. The proposed method is assessed against common search methods such as particle swarm optimization and genetic algorithms, and against different single-objective systems. Tested on different UCI data sets, the proposed system outperforms the other search methods and the single-objective methods and achieves greater robustness and stability.
- Published
- 2015
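The grey wolf search underlying the abstract above has a compact canonical form: each wolf is pulled towards the three best solutions (alpha, beta, delta), with an exploration factor that decays over the run. This is a minimal continuous-domain sketch of that core update (shown on a test function); the paper wraps it in a binary, multi-objective feature-selection loop, which is omitted here.

```python
import numpy as np

def gwo_minimize(f, dim, n_wolves=12, iters=100, lb=-5.0, ub=5.0, seed=0):
    """Canonical grey wolf optimizer: positions are pulled towards the three
    best wolves (alpha, beta, delta); 'a' decays from 2 to 0 to shift the
    search from exploration to exploitation."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (n_wolves, dim))
    for t in range(iters):
        fitness = np.array([f(x) for x in X])
        order = np.argsort(fitness)
        alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
        a = 2.0 * (1 - t / iters)            # exploration factor decays to 0
        for i in range(n_wolves):
            new = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2 * a * r1 - a           # encircling coefficient
                C = 2 * r2
                D = np.abs(C * leader - X[i])
                new += leader - A * D        # step towards this leader
            X[i] = np.clip(new / 3.0, lb, ub)
    fitness = np.array([f(x) for x in X])
    return X[np.argmin(fitness)]
```

For feature selection, the position vector would be thresholded into a binary attribute mask and `f` replaced by the mutual-information/classifier objective described in the abstract.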
16. Coverage Closure Efficient UVM Based Generic Verification Architecture for Flash Memory Controllers
- Author
-
Sameh El-Ashry, Khaled Salah, and Ahmed El-Yamany
- Subjects
Computer science, Interface (Java), Process (computing), NAND gate, Memory controller, Flash memory, Application-specific integrated circuit, Embedded system, Design process, Wishbone
- Abstract
Memory controllers are the backbone of diverse architectures in the ASIC world. Among the many concerns in enhancing memory controller performance is the tremendous verification process, which consumes time, effort, and resources. This paper proposes an optimized generic Universal Verification Methodology (UVM) architecture to verify flash memory controllers. The architecture is based on a survey of the main flash memory controller architecture types, including Flex-One NAND, the Open NAND Flash Interface (ONFI), the Embedded Multi-Media Card (e.MMC), Universal Flash Storage (UFS), and the SD-CARD memory controller, examined with an open-source Wishbone (WB) interface. Introducing an optimized solution for most memory controller verification environments is a great challenge owing to the difficulty of building and reusing resources, the numerous protocols the verifier must be aware of, and the high number of iterations needed to reach full functional coverage. The generic environment offers several advantages, especially regarding the number of tests and sequences developed to achieve full coverage, and it provides the versatility of reusing pre-developed UVM architectures, which ultimately contributes to much less development time for the whole design process. Throughout the architecture, we use new techniques and state-of-the-art developed blocks to achieve the shortest coverage closure time, as well as an innovative way to build a reference model and to efficiently utilize and accelerate the scoreboard checking process.
- Published
- 2016
17. A novel assertion-based CAD tool for automatic extraction of functional coverage
- Author
-
Sameh El-Ashry, Hatem El-Kharashy, Ahmed El-Yamany, Khaled Salah, and Abdelrahman G. Abubakr
- Subjects
High-level verification, Functional verification, Computer science, Real-time computing, Runtime verification, SystemVerilog, Intelligent verification, Regression testing, Verilog, Software engineering, Software verification
- Abstract
Over the past decade, increasing hardware design complexity has made a justified and complete functional verification process a necessity. Implementing a fast and reliable verification environment is both challenging and time-consuming. Accordingly, running the verification process in parallel with the design is a must, as it also reduces time-to-market pressure. Nowadays, one of the most important parameters the industry seeks in any verification process is coverage closure, since it indicates when the verification process can stop. This paper proposes a new functional coverage environment based on regression tests and assertions for protocol checking and coverage of complex sequences, justified and tested with a novel CAD tool. The environment and the tool offer a variety of optimized features through a GUI interface that even designers may find handy in any long verification process.
- Published
- 2016
18. Echoing the 'Generality Concept' through the Bus Functional Model Architecture in Universal Verification Environments
- Author
-
Ahmed El-Yamany
- Subjects
Functional verification, Interface (Java), Computer science, Embedded system, Memory architecture, NAND gate, Bus Functional Model, Memory controller, Flash memory
- Abstract
Verification is a major milestone in the semiconductor industry. With huge investment and time-to-market pressure, the verification process must be expeditious and decisive. Since the introduction of the Universal Verification Methodology (UVM) by Accellera in 2011, new techniques and packages have been deliberately added to facilitate and generalize the verification process, as the UVM was originally meant to do, emphasizing the reusability concept. This paper proposes a new layer to be added to generic UVM environments when the theme of verification is the same for each intellectual property (IP) and only the interfaces differ. We validate our concept through the generic verification environment for flash memory controllers that we proposed in previous work. The paper uses the generality concept to create an optimized generic UVM architecture that verifies the main types of flash memory controllers with much less effort and tedium. This generic UVM architecture is developed to verify multiple protocols with generic host/memory interfaces. The main memory controllers we deal with are flash memory controllers such as Flex-One NAND, the Open NAND Flash Interface (ONFI), the Embedded Multi-Media Card (e.MMC), Universal Flash Storage (UFS), and the SD-CARD memory controller, examined with an open-source WB interface.
- Published
- 2016
19. A Multi-Agent Framework for Testing Distributed Systems
- Author
-
Hany F. El Yamany, Luiz Fernando Capretz, and Miriam A. M. Capretz
- Subjects
Test strategy, Agent technology, Computer science, Multi-agent system, White-box testing, Distributed computing, Software development, System testing, Software engineering, Software performance testing, Distributed systems, Software quality, Software testing, Cloud testing, Software construction, Web application, Software reliability testing, System integration testing
- Abstract
Software testing is a very expensive and time-consuming process: it can account for up to 50% of the total cost of software development, and distributed systems make it an even more daunting task. The research described in this paper investigates a novel multi-agent framework for testing 3-tier distributed systems. The paper describes the framework architecture as well as the communication mechanism among the agents in the architecture. A Web-based application is examined as a case study to validate the proposed framework. The framework is a step towards automating testing for distributed systems in order to enhance their reliability within an acceptable range of cost and time.
- Published
- 2015
20. Hybrid flower pollination algorithm with rough sets for feature selection
- Author
-
Eid Emary, Hossam M. Zawbaa, B. Parv, Aboul Ella Hassanien, and Waleed Yamany
- Subjects
Fitness function, Fitness approximation, Computer science, Population, Particle swarm optimization, Feature selection, Pattern recognition, Evolutionary computation, Minimum redundancy feature selection, Artificial intelligence, Rough set, Algorithm
- Abstract
The flower pollination algorithm (FPA) is a new evolutionary computation technique inspired by the pollination process of flowers. In this paper, a model for multi-objective feature selection based on FPA hybridized with rough sets is proposed. The model exploits the capabilities of both filter-based and wrapper-based feature selection. Filter-based approaches are data-oriented methods that are not directly related to classification performance; wrapper-based approaches are more related to classification performance but do not address redundancy and dependency among the selected features. Therefore, we propose a multi-objective fitness function that uses FPA to find the optimal feature subset, enhancing classification performance while guaranteeing minimum redundancy among the selected features. At the beginning of the optimization process, the fitness function uses mutual information among features as the optimization goal. Later, using the same population, the fitness function switches to being more classifier-dependent, exploiting a rough-set classifier as a guide to classification performance. The proposed model was tested on eight datasets from the UCI data repository and shows an advance over other search methods such as particle swarm optimization (PSO) and the genetic algorithm (GA).
- Published
- 2015
21. Moth-flame optimization for training Multi-Layer Perceptrons
- Author
-
Waleed Yamany, Mohammed Fawzy, Aboul Ella Hassanien, and Alaa Tharwat
- Subjects
Meta-optimization, Computer science, Ant colony optimization algorithms, Particle swarm optimization, Pattern recognition, Perceptron, Machine learning, Local optimum, Genetic algorithm, Artificial intelligence, Multi-swarm optimization, Metaheuristic
- Abstract
The Multi-Layer Perceptron (MLP) is one of the types of Feed-Forward Neural Networks (FFNNs). Searching for the weights and biases of an MLP is important to achieve minimum training error. In this paper, the Moth-Flame Optimizer (MFO) is used to train an MLP: MFO-MLP searches for the weights and biases of the MLP to achieve minimum error and a high classification rate. Five standard classification datasets and three function-approximation datasets are utilized to evaluate the performance of the proposed method. The proposed method (MFO-MLP) is compared with four well-known optimization algorithms, namely the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), and Evolution Strategy (ES). The experimental results show that the MFO algorithm is very competitive, mitigates the local optima problem, and achieves high accuracy.
- Published
- 2015
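Training an MLP with a metaheuristic, as in the abstract above, amounts to flattening the weights and biases into one parameter vector and treating training error as the fitness to minimize. The sketch below shows that encoding with a simple stochastic local search standing in for MFO; the network sizes and step scale are illustrative assumptions.

```python
import numpy as np

def unpack(theta, n_in, n_hidden, n_out):
    """Decode a flat parameter vector into MLP weights and biases."""
    i = 0
    W1 = theta[i:i + n_in * n_hidden].reshape(n_in, n_hidden); i += n_in * n_hidden
    b1 = theta[i:i + n_hidden]; i += n_hidden
    W2 = theta[i:i + n_hidden * n_out].reshape(n_hidden, n_out); i += n_hidden * n_out
    b2 = theta[i:i + n_out]
    return W1, b1, W2, b2

def mlp_error(theta, X, y, n_hidden=4):
    """Fitness: mean squared training error of the decoded MLP (sigmoid units)."""
    W1, b1, W2, b2 = unpack(theta, X.shape[1], n_hidden, 1)
    h = 1 / (1 + np.exp(-(X @ W1 + b1)))
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))
    return float(np.mean((out.ravel() - y) ** 2))

def train_stochastic(X, y, n_hidden=4, iters=3000, seed=0):
    """Stand-in optimizer (perturb-and-accept); the paper uses MFO here."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1] * n_hidden + n_hidden + n_hidden + 1
    best = rng.normal(size=dim)
    best_err = mlp_error(best, X, y, n_hidden)
    for _ in range(iters):
        cand = best + 0.5 * rng.normal(size=dim)   # random perturbation
        err = mlp_error(cand, X, y, n_hidden)
        if err < best_err:                          # keep only improvements
            best, best_err = cand, err
    return best, best_err
```

Any population-based optimizer (MFO, PSO, GA) can be dropped in place of `train_stochastic` since the interface is just "minimize `mlp_error` over theta".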
22. New Rough Set Attribute Reduction Algorithm Based on Grey Wolf Optimization
- Author
-
Eid Emary, Aboul Ella Hassanien, and Waleed Yamany
- Subjects
Reduction (complexity), Reduction strategy, Hierarchy (mathematics), Computer science, Heuristic (computer science), Population, Crossover, Rough set, Algorithm, Evolutionary computation
- Abstract
In this paper, we propose a new attribute reduction strategy based on rough sets and grey wolf optimization (GWO). Rough sets have been used as an attribute reduction technique with much success, but current hill-climbing rough set approaches are poor at finding optimal reductions, as no perfect heuristic can guarantee optimality, while complete searches are not feasible for even medium-sized datasets. Stochastic approaches therefore provide a promising alternative. Like genetic algorithms, GWO is a new evolutionary computation technique; it mimics the leadership hierarchy and hunting mechanism of grey wolves in nature, finding optimal regions of the complex search space through the interaction of individuals in the population. Compared with GAs, GWO does not need complex operators such as crossover and mutation; it requires only primitive and easy mathematical operators and is computationally inexpensive in terms of both memory and runtime. Experimentation carried out on UCI data compares the proposed algorithm with a GA-based approach and other deterministic rough set reduction algorithms. The results show that GWO is efficient for rough set-based attribute reduction.
- Published
- 2015
23. Intelligent security and access control framework for service-oriented architecture
- Author
-
Miriam A. M. Capretz, David S. Allison, and Hany F. El Yamany
- Subjects
Cloud computing security, Computer science, Access control, Service-oriented architecture, Computer security model, Computer security, Web application security, Security service, Security through obscurity, Software engineering, SOA security, Information Systems
- Abstract
One of the most significant difficulties in developing Service-Oriented Architecture (SOA) is meeting its security challenges, since the responsibilities for SOA security fall on both the service providers and the consumers. In recent years, many solutions to these challenges have been implemented, such as the Web Services Security Standards, including WS-Security and WS-Policy. However, those standards are insufficient for the new generation of Web technologies, including Web 2.0 applications. In this research, we propose an intelligent SOA security framework by introducing its two most promising services: the Authentication and Security Service (NSS) and the Authorization Service (AS). The suggested autonomic and reusable services are constructed as an extension of the WS-* security standards, with the addition of intelligent mining techniques, in order to improve performance and effectiveness. We apply three different mining techniques: Association Rules, which help to predict attacks; the Online Analytical Processing (OLAP) Cube, for authorization; and clustering mining algorithms, which facilitate the representation and automation of access control rights. Furthermore, a case study is explored to depict the behavior of the proposed services inside an SOA business environment. We believe that this work is a significant step towards achieving dynamic SOA security that automatically controls access to new versions of Web applications, including analyzing and dropping suspicious SOAP messages and automatically managing authorization roles.
- Published
- 2010
24. A New Multi-layer Perceptrons Trainer Based on Ant Lion Optimization Algorithm
- Author
-
Waleed Yamany, Aboul Ella Hassanien, Mohammad F. Hassanin, Tai-hoon Kim, Alaa Tharwat, and Tarek Gaber
- Subjects
Meta-optimization, Local optimum, Artificial neural network, Computer science, Ant colony optimization algorithms, Genetic algorithm, Particle swarm optimization, Perceptron, Metaheuristic, Algorithm
- Abstract
In this paper, the Ant Lion Optimizer (ALO) is presented as a trainer for the Multi-Layer Perceptron (MLP). ALO is used to find the weights and biases of the MLP that achieve a minimum error and a high classification rate. Four standard classification datasets were used to benchmark the performance of the proposed method. In addition, the performance of the proposed method was compared with three well-known optimization algorithms, namely the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Ant Colony Optimization (ACO). The experimental results showed that the ALO-trained MLP was very competitive, as it avoided the local optima problem and achieved a high accuracy rate.
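The idea of metaheuristic MLP training can be sketched as follows: the network's weights and biases are flattened into one vector that a population-based optimizer searches, with classification error as the fitness. This is an illustrative sketch, not the paper's code; the network sizes, dataset, and the simple hill-climbing stand-in for the ALO search loop are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic 2-class dataset: 2 features, 100 samples.
X = rng.normal(size=(100, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

N_IN, N_HID, N_OUT = 2, 5, 1
N_PARAMS = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases

def decode(vec):
    """Unpack a flat parameter vector into MLP weight matrices and biases."""
    i = 0
    W1 = vec[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = vec[i:i + N_HID]; i += N_HID
    W2 = vec[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = vec[i:i + N_OUT]
    return W1, b1, W2, b2

def fitness(vec):
    """Classification error rate -- the quantity the optimizer minimizes."""
    W1, b1, W2, b2 = decode(vec)
    h = np.tanh(X @ W1 + b1)
    out = 1 / (1 + np.exp(-(h @ W2 + b2)))
    pred = (out.ravel() > 0.5).astype(int)
    return np.mean(pred != y)

# Stand-in for the ALO search loop: keep the best candidate vector found.
best = rng.normal(size=N_PARAMS)
for _ in range(200):
    cand = best + rng.normal(scale=0.5, size=N_PARAMS)
    if fitness(cand) < fitness(best):
        best = cand
```

Any population-based algorithm (ALO, GA, PSO, ACO) can replace the perturbation loop: only the way candidate vectors are generated changes, while `decode` and `fitness` stay the same.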
- Published
- 2015
25. Attribute reduction approach based on modified flower pollination algorithm
- Author
-
Eid Emary, Aboul Ella Hassanien, Waleed Yamany, and Hossam M. Zawbaa
- Subjects
Meta-optimization ,Fitness function ,Computer science ,Feature vector ,Genetic algorithm ,Particle swarm optimization ,Multi-swarm optimization ,Metaheuristic ,Algorithm ,Evolutionary computation - Abstract
An attribute reduction approach is proposed in this paper based on a modified version of the Flower Pollination Algorithm (FPA). FPA is a recently developed evolutionary computation technique inspired by the pollination process of flowers. The modified FPA adaptively balances exploration and exploitation to quickly find the optimal solution, using local search with adaptive search diversity. It can quickly search the feature space for an optimal or near-optimal feature subset that minimizes a given fitness function. The proposed fitness function incorporates both classification accuracy and feature reduction size. The proposed system is applied to eight datasets from the UCI machine learning repository and shows good performance in comparison with the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), which are commonly used in this context.
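A fitness function of the kind the abstract describes, combining classification accuracy and subset size, can be sketched as a weighted sum. The weighting `alpha`, the 1-NN classifier, and the synthetic data are assumptions for illustration, not the paper's exact choices.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 8))                     # 8 candidate attributes
y = (X[:, 0] - X[:, 3] > 0).astype(int)          # only 2 are informative

def knn_error(Xs, y):
    """Leave-one-out 1-NN error on the selected feature columns."""
    d = np.linalg.norm(Xs[:, None, :] - Xs[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)                  # exclude each point itself
    return np.mean(y[d.argmin(axis=1)] != y)

def fitness(mask, alpha=0.9):
    """Lower is better: weighted error plus relative subset size."""
    if not mask.any():
        return 1.0                               # empty subset: worst score
    return alpha * knn_error(X[:, mask], y) + (1 - alpha) * mask.mean()

all_feats = np.ones(8, dtype=bool)
good = np.zeros(8, dtype=bool); good[[0, 3]] = True
```

The size term rewards small subsets, so a subset containing only the informative attributes scores better than selecting everything, which is exactly the trade-off an FPA (or GA/PSO) search over binary masks exploits.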
- Published
- 2015
26. Smart OptiSelect Preference Based Innovative Framework for User-in-the-Loop Feature Selection in Software Product Lines
- Author
-
Mohamed Shaheen Elgamel and Ahmed Eid El Yamany
- Subjects
Software ,Database ,business.industry ,Computer science ,Product (mathematics) ,Feature selection ,User-in-the-loop ,Data mining ,computer.software_genre ,business ,computer ,Preference - Published
- 2015
27. A Generalized Service Replication Process in Distributed Environments
- Author
-
Marwa F. Mohamed, Miriam A. M. Capretz, Katarina Grolinger, and Hany F. El Yamany
- Subjects
Service (systems architecture) ,service-oriented architecture ,computer.internet_protocol ,Computer science ,mobile computing ,Distributed computing ,Data_MISCELLANEOUS ,Mobile computing ,computer.software_genre ,service replication ,Data_FILES ,cloud ,replication process ,business.industry ,Quality of service ,Software Engineering ,Service-oriented architecture ,Electrical and Computer Engineering ,Service provider ,Replication (computing) ,quality of service ,Data as a service ,Web service ,business ,computer ,Other Computer Sciences ,Computer network - Abstract
Replication is one of the main techniques aiming to improve Web services’ (WS) quality of service (QoS) in distributed environments, including clouds and mobile devices. Service replication is a way of improving WS performance and availability by creating several copies or replicas of Web services which work in parallel or sequentially under defined circumstances. In this paper, a generalized replication process for distributed environments is discussed based on established replication studies. The generalized replication process consists of three main steps: sensing the environment characteristics, determining the replication strategy, and implementing the selected replication strategy. To demonstrate application of the generalized replication process, a case study in the telecommunication domain is presented. The adequacy of the selected replication strategy is demonstrated by comparing it to another replication strategy as well as to a non-replicated service. The authors believe that a generalized replication process will help service providers to enhance QoS and accordingly attract more customers.
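The three-step generalized process described above can be outlined as a skeleton: sense the environment, determine a strategy, implement it. The strategy names, environment fields, and thresholds below are hypothetical placeholders, not values from the paper.

```python
def sense_environment():
    """Step 1: collect environment characteristics (illustrative values)."""
    return {"platform": "cloud", "load": 0.8, "latency_ms": 120}

def determine_strategy(env):
    """Step 2: map sensed characteristics to a replication strategy."""
    if env["platform"] == "mobile":
        return "sequential-replicas"      # conserve device resources
    if env["load"] > 0.7 or env["latency_ms"] > 100:
        return "parallel-replicas"        # replicas serve requests in parallel
    return "no-replication"

def implement_strategy(name):
    """Step 3: deploy the chosen strategy (stub)."""
    return f"deployed: {name}"

result = implement_strategy(determine_strategy(sense_environment()))
```

The value of separating the steps is that each can vary independently: a mobile deployment changes only the sensing inputs, while a new replication strategy extends only the decision mapping.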
- Published
- 2015
28. New approach for feature selection based on rough set and bat algorithm
- Author
-
Aboul Ella Hassanien, Waleed Yamany, and Eid Emary
- Subjects
Statistical classification ,Fitness function ,Feature (computer vision) ,Computer science ,business.industry ,Crossover ,Pattern recognition ,Feature selection ,Algorithm design ,Rough set ,Artificial intelligence ,business ,Bat algorithm - Abstract
This paper presents a new feature selection technique based on rough sets and the bat algorithm (BA). BA is attractive for feature selection in that bats discover the best feature combinations as they fly within the feature subset space. Compared with GAs, BA does not need complex operators such as crossover and mutation; it requires only primitive and simple mathematical operators, and is computationally inexpensive in terms of both memory and runtime. A fitness function based on rough sets is designed as the optimization target. This fitness function incorporates both the classification accuracy and the number of selected features, and hence balances classification performance and reduction size. The paper makes use of four initialization strategies for starting the optimization and studies their effect on bat performance; the strategies reflect forward feature selection, backward feature selection, and combinations of both. Experimentation is carried out on UCI data sets, comparing the proposed algorithm with GA-based and PSO-based approaches for rough-set feature reduction. The results on different data sets show that the bat algorithm is efficient for rough-set-based feature selection: the rough-set-based fitness function ensures better classification results while also keeping the feature subset small.
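The initialization strategies mentioned above can be sketched for a binary bat population over an n-feature space: a forward-style start begins near the empty subset, a backward-style start near the full subset, and a mixed population combines both. The selection probabilities are illustrative assumptions, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(2)

def init_population(n_bats, n_feats, strategy):
    """Binary population: each row is one bat's feature-inclusion mask."""
    if strategy == "forward":        # few features selected initially
        p = 0.1
    elif strategy == "backward":     # most features selected initially
        p = 0.9
    elif strategy == "mixed":        # half forward-like, half backward-like
        half = n_bats // 2
        return np.vstack([init_population(half, n_feats, "forward"),
                          init_population(n_bats - half, n_feats, "backward")])
    else:                            # uniform random subsets
        p = 0.5
    return rng.random((n_bats, n_feats)) < p

pop = init_population(20, 10, "mixed")
```

Starting the swarm from both ends of the subset lattice is what lets the search behave like forward selection, backward elimination, or a blend, before the bat dynamics take over.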
- Published
- 2014
29. OPTI-SELECT
- Author
-
Abdel Salam Sayyad, Mohamed Shaheen, and Ahmed Eid El Yamany
- Subjects
business.industry ,Computer science ,Search-based software engineering ,Feature selection ,User-in-the-loop ,computer.software_genre ,Machine learning ,Multi-objective optimization ,Feature model ,Software ,Feature (computer vision) ,Domain engineering ,Data mining ,Artificial intelligence ,business ,computer - Abstract
Opti-Select is an interactive multi-objective feature analysis and optimization tool for software product line configuration and feature model optimization, based on an innovative User-In-the-Loop (UIL) idea. In this tool, the experience of system analysts and stakeholders is merged with optimization techniques and algorithms. The Opti-Select interactive tool is an integrated set of techniques providing step-by-step feature model and attribute configuration, feature selection and exclusion, solution set optimization, and user interaction utilities that together reach a satisfactory set of solutions fitting stakeholder preferences.
- Published
- 2014
30. Developing a RDB-RDF management framework for interoperable web environments
- Author
-
Karim M. Fadel, Hany F. El Yamany, and Mahmoud H. Roshdy
- Subjects
World Wide Web ,WS-I Basic Profile ,Computer science ,SPARQL ,computer.file_format ,Web mapping ,Linked data ,Semantic Web Stack ,WS-Policy ,computer ,Semantic Web ,Data Web - Abstract
Recently, mapping from Relational Databases (RDB) into the Resource Description Framework (RDF) has become an essential demand for people who consume the Web in their daily activities and businesses, driven by the need for complete, integrated, trustworthy and understandable data that satisfies their requirements. However, managing and controlling the RDB-RDF mapping process is a challenging task because of the rapid change in the massive amount of data collected from a huge and diverse number of providers and vendors. In this work, an RDB-RDF Management Framework is proposed to automatically manage and monitor the mapping process from relational to semantic data. The proposed framework includes a set of Data Handlers and Web services (WSs) in order to improve Web environment interoperability and reliability. Specifically, three sets of services are established and consumed to handle and organize the mapping process. One of the three suggested sets of services operates like an intelligent agent, continuously sensing changes that might occur on the relational data storage side at three different levels: schema, relation and data. Finally, the implementation and a particular case study of the suggested framework are introduced and discussed.
- Published
- 2013
31. Evaluation of depth compression and view synthesis distortions in multiview-video-plus-depth coding systems
- Author
-
Kemal Ugur, Miska Hannuksela, Moncef Gabbouj, and Noha A. El-Yamany
- Subjects
Reference software ,business.industry ,Computer science ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Data_CODINGANDINFORMATIONTHEORY ,View synthesis ,Depth map ,Distortion ,Encoding (memory) ,Metric (mathematics) ,Computer vision ,Artificial intelligence ,business ,Reliability (statistics) ,Data compression - Abstract
Several quality evaluation studies have been performed for video-plus-depth coding systems. In these studies, however, the distortions in the synthesized views have been quantified in experimental setups where both the texture and depth videos are compressed. Nevertheless, there are several factors that affect the quality of the synthesized view. Incorporating more than one source of distortion in the study could be misleading; one source of distortion could mask (or be masked by) the effect of other sources of distortion. In this paper, we conduct a quality evaluation study that aims to assess the distortions introduced by the view synthesis procedure and depth map compression in multiview-video-plus-depth coding systems. We report important findings that many of the existing studies have overlooked, yet are essential to the reliability of quality evaluation. In particular, we show that the view synthesis reference software yields high distortions that mask those due to depth map compression, when the distortion is measured by average luma peak signal-to-noise ratio. In addition, we show what quality metric to use in order to reliably quantify the effect of depth map compression on view synthesis quality. Experimental results that support these findings are provided for both synthetic and real multiview-video-plus-depth sequences.
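The distortion measure discussed above, average luma peak signal-to-noise ratio, can be computed as a minimal sketch for 8-bit luma frames (peak value 255); the function name and test frame are illustrative.

```python
import numpy as np

def psnr_luma(ref, test, peak=255.0):
    """PSNR in dB between a reference and a distorted 8-bit luma frame."""
    ref = ref.astype(np.float64)
    test = test.astype(np.float64)
    mse = np.mean((ref - test) ** 2)
    if mse == 0:
        return float("inf")          # identical frames
    return 10.0 * np.log10(peak ** 2 / mse)

ref = np.full((4, 4), 128, dtype=np.uint8)
noisy = ref.copy()
noisy[0, 0] = 138                    # one pixel off by 10
```

Because PSNR averages the squared error over the whole frame, a large distortion from one source (here, view synthesis) dominates the mean and can hide a smaller one (depth compression), which is exactly the masking effect the study reports.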
- Published
- 2010
32. Privacy and trust policies within SOA
- Author
-
Miriam A. M. Capretz, David S. Allison, and H.F. El Yamany
- Subjects
Service (systems architecture) ,Information privacy ,Privacy by Design ,Computer science ,Privacy software ,business.industry ,Privacy policy ,Data_MISCELLANEOUS ,Internet privacy ,Computer security ,computer.software_genre ,Metamodeling ,Element (criminal law) ,Architecture ,business ,computer - Abstract
Privacy for Service-Oriented Architecture (SOA) is required to gain the trust of those who would use the technology. Through the use of an independent privacy service (PS), the privacy policies of a service consumer and provider can be compared to create an agreed upon privacy contract. In this paper we further define a metamodel for privacy policy creation and comparison. A trust element is developed as an additional criterion for a privacy policy. We define the PS and what operations it must perform to accomplish its goals. We believe this PS combined with the presented metamodel provide a strong solution to providing privacy for SOA.
- Published
- 2009
33. QoSS Policies within SOA
- Author
-
David S. Allison, Hany F. El Yamany, M. Beatriz F. Toledo, Diego Zuquim Guimarães Garcia, and Miriam A. M. Capretz
- Subjects
Service (business) ,Authentication ,computer.internet_protocol ,Computer science ,media_common.quotation_subject ,Authorization ,Service-oriented architecture ,Computer security ,computer.software_genre ,Metadata ,Intelligent agent ,Security service ,Quality (business) ,computer ,XML ,media_common - Abstract
In this work, we present metadata for Quality of Security Service (QoSS) in Service-Oriented Architecture, which supports the description of authentication, authorization and privacy features. The metadata is encapsulated by a QoSS service in order to assist the service consumer and provider in reaching a QoSS contract that meets both of their security requirements. This contract serves as an enforced policy for managing the interactions between the parties.
- Published
- 2009
34. Quality of Security Service for Web Services within SOA
- Author
-
H.F. El Yamany, Miriam A. M. Capretz, and David S. Allison
- Subjects
Service (business) ,business.industry ,Service delivery framework ,Computer science ,Service design ,Service level objective ,Application service provider ,Service level requirement ,Service provider ,Differentiated service ,Computer security ,computer.software_genre ,World Wide Web ,business ,computer - Abstract
Service-Oriented Architecture (SOA) is a paradigm for creating and encapsulating business processes in the form of loosely coupled, autonomous and abstracted services. Managing the non-functional requirements of SOA, such as security, is an overarching problem due to the wide variety of ways the service consumer can access the services offered by the service provider, and the equally varied restrictions the service provider can set on access by the service consumer. In this work, we propose metadata for quality of security service in SOA. The proposed metadata provides different levels to describe the available variations of the authentication, authorization and privacy features related to SOA security. A Web Service for Quality of Security Service (QoSS) is then constructed to encapsulate the suggested metadata in order to assist the service consumer and provider in reaching a QoSS agreement that meets both of their requirements. The QoSS agreement performs as an enforced policy for managing the interactions between the service provider and consumer. The QoSS service is located inside a complete framework for securing SOA.
- Published
- 2009
35. A Fine-Grained Privacy Structure for Service-Oriented Architecture
- Author
-
Miriam A. M. Capretz, H.F. El Yamany, and David S. Allison
- Subjects
Vocabulary ,Information privacy ,Markup language ,Database ,Privacy by Design ,computer.internet_protocol ,Privacy software ,Computer science ,business.industry ,media_common.quotation_subject ,Privacy policy ,XACML ,Access control ,Service-oriented architecture ,computer.software_genre ,World Wide Web ,Web service ,business ,computer ,XML ,media_common ,computer.programming_language - Abstract
In this paper we propose the creation and use of a privacy policy vocabulary. The elements of this policy will have criteria for comparison, creating hierarchical relationships between those elements that could not otherwise be directly compared. This policy vocabulary can be used in conjunction with the eXtensible Access Control Markup Language (XACML) to provide storage and enforcement.
- Published
- 2009
36. Use of Data Mining to Enhance Security for SOA
- Author
-
Miriam A. M. Capretz and Hany F. El Yamany
- Subjects
OASIS SOA Reference Model ,business.industry ,computer.internet_protocol ,Computer science ,Service-oriented architecture ,Web application security ,computer.software_genre ,Computer security ,World Wide Web ,Security service ,Mashup ,Data mining ,Web service ,business ,SOA Security ,computer ,Vulnerability (computing) - Abstract
Service-oriented architecture (SOA) is an architectural paradigm for developing distributed applications whose design is structured on loosely coupled services such as Web services. One of the most significant difficulties in developing SOA concerns its security challenges, since the responsibilities of SOA security fall on both the servers and the clients. In recent years, many solutions have been implemented, such as the Web services security standards, including WS-Security and WS-SecurityPolicy. However, those standards are insufficient for the promising new generations of Web applications, such as Web 2.0 and its successor, Web 3.0. In this work, we propose an intelligent security service for SOA that uses data mining to predict the attacks that could arise with SOAP (Simple Object Access Protocol) messages. Moreover, this service validates new security policies before deploying them on the service provider side by testing the probability of their vulnerability.
- Published
- 2008
37. An authorization model for Web Services within SOA
- Author
-
Miriam A. M. Capretz and H.F. El Yamany
- Subjects
Flexibility (engineering) ,Service (systems architecture) ,Computer access control ,Computer science ,computer.internet_protocol ,business.industry ,Access control ,Service-oriented architecture ,computer.software_genre ,World Wide Web ,Authorization certificate ,Web service ,Software architecture ,business ,computer - Abstract
One of the most significant difficulties in developing service-oriented architecture (SOA) concerns its security challenges. In particular, the authorization task is especially demanding because of the diverse access requirements within the various SOA environments, such as business, academic and industrial settings. In this paper, we propose a 4-attribute vector authorization model constructed in the form of an authorization service (AS). The proposed model is founded on attribute-role-based access control and contains an intelligent mining engine that facilitates the management of several authorization roles and provides the flexibility to dynamically infer and assign roles for a wide range of users and objects.
- Published
- 2008
38. Adaptive framework for robust high-resolution image reconstruction in multiplexed computational imaging architectures
- Author
-
Noha A. El-Yamany, P. Papamichalis, and Marc P. Christensen
- Subjects
Adaptive algorithm ,business.industry ,Computer science ,Anisotropic diffusion ,Materials Science (miscellaneous) ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Image processing ,Iterative reconstruction ,Similarity measure ,Industrial and Manufacturing Engineering ,Computational photography ,Optics ,Robustness (computer science) ,Computer Science::Computer Vision and Pattern Recognition ,Medical imaging ,Computer vision ,Artificial intelligence ,Business and International Management ,Image sensor ,business ,Reconstruction procedure - Abstract
In multiplexed computational imaging schemes, high-resolution images are reconstructed by fusing the information in multiple low-resolution images detected by a two-dimensional array of low-resolution image sensors. The reconstruction procedure assumes a mathematical model for the imaging process that could have generated the low-resolution observations from an unknown high-resolution image. In practical settings, the parameters of the mathematical imaging model are known only approximately and are typically estimated before the reconstruction procedure takes place. Violations of the assumed model, such as inaccurate knowledge of the field of view of the imagers, erroneous estimation of the model parameters, and/or accidental scene or environmental changes can be detrimental to the reconstruction quality, even if they are small in number. We present an adaptive algorithm for robust reconstruction of high-resolution images in multiplexed computational imaging architectures. Using robust M-estimators and incorporating a similarity measure, the proposed scheme adopts an adaptive estimation strategy that effectively deals with violations of the assumed imaging model. Comparisons with nonadaptive reconstruction techniques demonstrate the superior performance of the proposed algorithm in terms of reconstruction quality and robustness.
- Published
- 2008
39. Robust Color Image Superresolution: An Adaptive M-Estimation Framework
- Author
-
P. Papamichalis and Noha A. El-Yamany
- Subjects
Biometrics ,business.industry ,Color image ,Computer science ,lcsh:Electronics ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,lcsh:TK7800-8360 ,Pattern recognition ,Superresolution ,Norm (mathematics) ,Signal Processing ,Outlier ,Computer vision ,Artificial intelligence ,Electrical and Electronic Engineering ,business ,Information Systems - Abstract
This paper introduces a new color image superresolution algorithm in an adaptive, robust M-estimation framework. Using a robust error norm in the objective function, and adapting the estimation process to each of the low-resolution frames, the proposed method effectively suppresses the outliers due to violations of the assumed observation model, and results in color superresolution estimates with crisp details and no color artifacts, without the use of regularization. Experiments on both synthetic and real sequences demonstrate the superior performance over using the L2 and L1 error norms in the objective function.
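The robust-norm idea behind these M-estimation abstracts can be illustrated with a redescending estimator: unlike the L2 norm (constant influence weight) or the L1 norm, a redescending estimator such as Tukey's biweight drives the influence of large residuals (outliers) to zero. Tukey's biweight is a standard example of this estimator class, not necessarily the papers' exact choice, and the tuning constant `c` is an illustrative value.

```python
import numpy as np

def tukey_weight(r, c=4.685):
    """IRLS weight w(r) = psi(r)/r for Tukey's biweight estimator.

    Residuals with |r| <= c receive a smoothly decreasing weight;
    residuals beyond c (outliers) receive exactly zero weight.
    """
    w = (1.0 - (r / c) ** 2) ** 2
    return np.where(np.abs(r) <= c, w, 0.0)

residuals = np.array([0.0, 1.0, 3.0, 10.0])
weights = tukey_weight(residuals)
```

In an iteratively reweighted reconstruction, pixels whose residuals violate the assumed observation model are simply down-weighted to zero, which is why such schemes can suppress outliers without an explicit regularization term.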
- Published
- 2008
40. A memory and MHZ efficient EDMA transfer scheme for video encoding algorithms on TI TMS320DM642
- Author
-
Noha A. El-Yamany
- Subjects
Memory buffer register ,business.industry ,Computer science ,Image processing ,computer.file_format ,Data buffer ,Encoding (memory) ,Computer data storage ,Central processing unit ,Raster graphics ,business ,computer ,Algorithm ,Computer hardware ,Block (data storage) - Abstract
Video encoding algorithms require processing of data arranged in blocks of pixels. For efficient computation, pixel blocks are expected to be stored contiguously in memory, and within each block, pixels are to be arranged in raster-scan order (left to right, top to bottom). Since data captured from the video port is linearly arranged in memory (one line after another), it is necessary to arrange the data in two-dimensional form before encoding. A common approach to achieving the two-dimensional arrangement is through optimized functions (in C or assembly) that rearrange the captured data, held in an intermediate buffer, into an input buffer from which it is ready for encoding. However, this approach has two main drawbacks. First, a portion of the CPU MHz budget is consumed solely on data arrangement. Second, an intermediate data buffer is required to hold the data before the arrangement into the input buffer takes place, increasing the memory requirements. In this paper, a memory- and MHz-efficient EDMA transfer scheme is introduced for simultaneous data transfer and two-dimensional arrangement from the video port to the DSP memory. The proposed scheme is described in detail for the TI TMS320DM642.
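The rearrangement the EDMA performs in hardware can be illustrated in software: a linearly captured frame (one line after another) is regrouped so that each pixel block is contiguous and raster-ordered within itself. The 8x8 block size and function name below are illustrative assumptions; this sketch shows the data movement, not the TI EDMA API.

```python
import numpy as np

def linear_to_blocks(frame_1d, width, height, b=8):
    """Regroup a linear raster frame into contiguous b-by-b pixel blocks.

    Returns an array with one row per block; within each row the block's
    pixels appear in raster-scan order (left to right, top to bottom).
    """
    frame = frame_1d.reshape(height, width)
    return (frame.reshape(height // b, b, width // b, b)
                 .swapaxes(1, 2)          # -> (block row, block col, b, b)
                 .reshape(-1, b * b))     # each row: one contiguous block

frame = np.arange(16 * 16, dtype=np.uint8)   # 16x16 test frame, values 0..255
blocks = linear_to_blocks(frame, 16, 16, b=8)
```

Doing this rearrangement as part of the DMA transfer, rather than with CPU copy loops as above, is what saves both the intermediate buffer and the CPU cycles the paper targets.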
- Published
- 2008
41. An adaptive M-estimation framework for robust image super resolution without regularization
- Author
-
Noha A. El-Yamany and P. Papamichalis
- Subjects
Computer science ,business.industry ,Norm (mathematics) ,Outlier ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Computer vision ,Artificial intelligence ,Similarity measure ,business ,Algorithm ,Regularization (mathematics) ,Superresolution ,Image resolution - Abstract
This paper introduces a new image super-resolution algorithm in an adaptive, robust M-estimation framework. Super-resolution reconstruction is formulated as an optimization (minimization) problem whose objective function is based on a robust error norm. The effectiveness of the proposed scheme lies in the selection of a specific class of robust M-estimators, redescending M-estimators, and the incorporation of a similarity measure to adapt the estimation process to each of the low-resolution frames. This choice helps in dealing with violations of the assumed imaging model that could have generated the low-resolution frames from the unknown high-resolution one. The proposed approach effectively suppresses outliers without the use of regularization in the objective function, and results in high-resolution images with crisp details and no artifacts. Experiments on both synthetic and real sequences demonstrate superior performance over methods based on the L2 and L1 norms in the objective function.
- Published
- 2008
42. Using bounded-influence M-estimators in multi-frame super-resolution reconstruction: A comparative study
- Author
-
P. Papamichalis and N.A. El-Yamany
- Subjects
Pixel ,business.industry ,Computer science ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Estimator ,Image registration ,Iterative reconstruction ,Superresolution ,Robustness (computer science) ,Bounded function ,Outlier ,Computer vision ,Artificial intelligence ,business ,Algorithm ,Image resolution - Abstract
This paper introduces a comparative study of bounded-influence M-estimators in the context of multi-frame super-resolution reconstruction. The objectives of this study are to compare these estimators in terms of robustness, and to highlight the associated tradeoff between robustness and edge preservation (crispness) in the presence of registration errors and motion outliers.
- Published
- 2008
43. Adaptive object identification and recognition using neural networks and surface signatures
- Author
-
S.M. Yamany and A.A. Farag
- Subjects
Surface (mathematics) ,Artificial neural network ,Computer science ,Orientation (computer vision) ,business.industry ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Cognitive neuroscience of visual object recognition ,Representation (systemics) ,Image registration ,Pattern recognition ,Object (computer science) ,Adaptive filter ,Computer vision ,Artificial intelligence ,business ,ComputingMethodologies_COMPUTERGRAPHICS - Abstract
The paper introduces an adaptive technique for 3D object identification and recognition in 3D scanned scenes. This technique uses neural learning of the 3D free-form surface representation of the object in study. This representation scheme captures the 3D curvature information of any free-form surface and encodes it into a 2D image corresponding to a certain point on the surface. This image represents a "surface signature" because it is unique for this point and is independent of the object translation or orientation in space.
- Published
- 2004
44. A design of multiagent-based system for routing
- Author
-
Z.T. Fayed, H.M. Faheem, H.F. El-Yamany, and T.M. El-Areaf
- Subjects
Routing protocol ,Router ,business.industry ,Network packet ,Computer science ,Distributed computing ,Multi-agent system ,media_common.quotation_subject ,Routing table ,ComputerSystemsOrganization_COMPUTER-COMMUNICATIONNETWORKS ,Intelligent decision support system ,computer.software_genre ,Network topology ,Intelligent agent ,business ,computer ,Internetworking ,Autonomy ,Computer network ,media_common - Abstract
The rapid development of internetworking and the need to deploy much more intelligent systems to speed up convergence times have brought intelligent agents into the data communication domain. Intelligent agents are ideally suited to data communication applications because of their interactivity, autonomy, reactivity, and intelligence. A novel functional design of a multiagent-based router that can be deployed in internetworking is described. Slight modifications are made to the traditional packet formats of either distance-vector or link-state protocols to accommodate the suggested system. Convergence time after topology changes is calculated for internetworks of different sizes and topologies. The addition of a new router, router failure, and link failure cases are examined thoroughly. This work is a step towards a complete multiagent-based system for internetworking and data communication.
- Published
- 2003
45. Object recognition using neural networks and surface signatures
- Author
-
Aly A. Farag, Ahmed M. El-Bialy, and S.M. Yamany
- Subjects
Artificial neural network ,business.industry ,Computer science ,Orientation (computer vision) ,3D single-object recognition ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Representation (systemics) ,Cognitive neuroscience of visual object recognition ,Image registration ,Pattern recognition ,Curvature ,Object (computer science) ,Computer Science::Computer Vision and Pattern Recognition ,Computer vision ,Artificial intelligence ,business ,ComputingMethodologies_COMPUTERGRAPHICS ,Feature detection (computer vision) - Abstract
We present a concept for 3D free-form object recognition using a surface representation scheme. This representation scheme captures the 3D curvature information of any free-form surface and encodes it into a 2D image corresponding to a certain point on the surface. This image is unique for that point and is independent of the object's translation or orientation in space; for this reason we call it the surface point signature (SPS). Using specially designed neural networks, the SPS images are used in the matching and recognition of 3D objects in a 3D scanned scene.
- Published
- 2003
46. Integrating shape from shading and range data using neural networks
- Author
-
M.G.-H. Mostafa, S.M. Yamany, and Aly A. Farag
- Subjects
Artificial neural network ,Computer science ,business.industry ,Supervised learning ,3D reconstruction ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Pattern recognition ,Iterative reconstruction ,Backpropagation ,Extended Kalman filter ,Photometric stereo ,Feedforward neural network ,Artificial intelligence ,business ,ComputingMethodologies_COMPUTERGRAPHICS ,Sparse matrix - Abstract
This paper presents a framework for integrating multiple sensory data, sparse range data and dense depth maps from shape from shading in order to improve the 3D reconstruction of visible surfaces of 3D objects. The integration process is based on propagating the error difference between the two data sets by fitting a surface to that difference and using it to correct the visible surface obtained from shape from shading. A feedforward neural network is used to fit a surface to the sparse data. We also study the use of the extended Kalman filter for supervised learning and compare it with the backpropagation algorithm. A performance analysis is done to obtain the best neural network architecture and learning algorithm. It is found that the integration of sparse depth measurements has greatly enhanced the 3D visible surface obtained from shape from shading in terms of metric measurements.
- Published
- 2003
47. Integrating stereo and shape from shading
- Author
-
M.G.-H. Mostafa, Aly A. Farag, and S.M. Yamany
- Subjects
business.industry ,Computer science ,Machine vision ,3D reconstruction ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Pattern recognition ,Iterative reconstruction ,Photometric stereo ,Stereopsis ,Feedforward neural network ,Computer vision ,Artificial intelligence ,Shading ,business ,Surface reconstruction ,ComputingMethodologies_COMPUTERGRAPHICS - Abstract
This paper presents a new method for integrating different low-level vision modules, stereo and shape from shading, in order to improve the 3D reconstruction of visible surfaces of objects from intensity images. The integration process is based on correcting the 3D visible surface obtained from shape from shading using the sparse depth measurements from the stereo module, by fitting a surface to the difference between the two data sets. A feedforward neural network is used to fit a surface to the error difference, and an extended Kalman filter algorithm is used for network learning. It is found that integrating the sparse depth measurements greatly enhances, in terms of metric measurements, the 3D visible surface obtained from shape from shading.
- Published
- 2003
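The extended-Kalman-filter learning mentioned in this abstract treats the network weights as the filter's state vector and each training pair as a scalar measurement, linearizing the network output around the current weights at every step. A minimal sketch of that idea, assuming a toy one-hidden-layer network fitted to sin(x); the architecture, noise settings, and data are invented for illustration and are not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny net y = c . tanh(a*x + b) + d, all parameters in one EKF state vector
nh = 5
w = rng.normal(0, 0.5, 3 * nh + 1)           # [a (5), b (5), c (5), d]

def net_and_jac(w, x):
    """Network output and its Jacobian with respect to the weights."""
    a, b, c, d = w[:nh], w[nh:2 * nh], w[2 * nh:3 * nh], w[-1]
    h = np.tanh(a * x + b)
    y = c @ h + d
    J = np.concatenate([c * (1 - h ** 2) * x,    # dy/da
                        c * (1 - h ** 2),        # dy/db
                        h,                       # dy/dc
                        [1.0]])                  # dy/dd
    return y, J

def mse(w):
    xs_t = np.linspace(-2, 2, 50)
    return np.mean([(net_and_jac(w, x)[0] - np.sin(x)) ** 2 for x in xs_t])

xs = rng.uniform(-2, 2, 200)
ys = np.sin(xs)
init_mse = mse(w)

# EKF weight estimation: predict step is trivial (weights are constant),
# update step uses the linearized measurement model
P = np.eye(len(w))        # weight covariance (illustrative initialization)
R = 0.1                   # assumed measurement noise variance
for _ in range(10):                       # a few passes over the data
    for x, y in zip(xs, ys):
        pred, Hj = net_and_jac(w, x)
        S = Hj @ P @ Hj + R               # innovation variance
        K = P @ Hj / S                    # Kalman gain
        w = w + K * (y - pred)
        P = P - np.outer(K, Hj @ P)

final_mse = mse(w)
```

The appeal of EKF training, which motivates the comparison with backpropagation in the paper, is that each weight update uses second-order information carried in the covariance `P`, so convergence in number of presentations is typically much faster.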
48. Multi-modal medical volumes fusion by surface matching
- Author
-
Aly A. Farag, S.M. Yamany, and Ayman M. Eldeib
- Subjects
Matching (statistics) ,Computer science ,business.industry ,Orientation (computer vision) ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Image registration ,Pattern recognition ,Maximization ,Mutual information ,Sensor fusion ,Six degrees of freedom ,Computer vision ,Artificial intelligence ,business ,Volume (compression) - Abstract
This paper presents a fast, six-degrees-of-freedom registration technique to accurately locate the position and orientation of medical volumes (e.g. CT, MRI) of the same patient with respect to each other. The technique combines surface registration with maximization of mutual information. We have developed a novel surface registration technique that produces highly accurate results when registering two volumes of the same individual generated from the same modality, such as preoperative and intraoperative MR volumes. When surface registration cannot accurately register the volumes, the result is refined by multi-modal volume registration. This combination yields accurate alignment while reducing the time needed for registration. For the multi-modal volume registration, maximization of mutual information (MI) is used as the matching criterion, with genetic algorithms (GA) as the search engine. Our results demonstrate that this technique allows fast, accurate, robust, and completely automatic registration of multi-modality medical volumes.
- Published
- 2003
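The mutual-information matching criterion used in the abstract above can be computed from the joint intensity histogram of the two volumes; a GA would then use this value as the fitness of each candidate 6-DOF pose. A minimal 2D sketch with synthetic images (bin count and data are illustrative, not the paper's settings):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """MI (in nats) between two equally sized intensity images/volumes,
    estimated from their joint histogram."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)       # marginal of a
    py = pxy.sum(axis=0, keepdims=True)       # marginal of b
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Hypothetical example: a well-aligned pair shares intensity structure,
# so MI is high; a misaligned (shifted) pair loses correspondence
rng = np.random.default_rng(0)
img = rng.random((64, 64))
aligned = img + 0.05 * rng.random((64, 64))
shifted = np.roll(aligned, 5, axis=1)
```

Because MI only assumes a statistical (not functional) relation between intensities, it works across modalities where direct intensity comparison fails, which is why it serves as the refinement criterion when surface registration alone is insufficient.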
49. Parametric and non-parametric techniques for identifying images of F-actin distribution in endothelial cells with applied agonists
- Author
-
W.D. Ehringer, S.M. Yamany, Aly A. Farag, F.N. Miller, and K.J. Khiani
- Subjects
Artificial neural network ,Contextual image classification ,business.industry ,Computer science ,Feature extraction ,Feed forward ,Nonparametric statistics ,Pattern recognition ,computer.software_genre ,Bayes' theorem ,Statistical classification ,ComputingMethodologies_PATTERNRECOGNITION ,Artificial intelligence ,Data mining ,business ,computer ,Parametric statistics - Abstract
This research develops automatic classification algorithms for identifying images of F-actin distribution in endothelial cells under different agonist treatments. Parametric and non-parametric classification techniques were investigated, including statistical and artificial neural network classifiers. First- and second-order features were extracted from the images. The statistical classification techniques include the Bayes approach and the K-nearest neighbor (K-NN) classifier. For the neural network approach, we used the multilayer feedforward network and the functional link network. All of these techniques provide adequate results, with the neural methods achieving the highest accuracy, above 97% correct classification.
- Published
- 2002
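Of the classifiers listed in the abstract above, the non-parametric K-NN rule is the simplest to sketch: classify each feature vector by a majority vote among its k closest training samples. The 2D feature vectors below are invented stand-ins for the paper's first- and second-order image features:

```python
import numpy as np

def knn_classify(train_X, train_y, x, k=3):
    """Plain K-nearest-neighbour majority vote (Euclidean distance)."""
    dist = np.linalg.norm(train_X - x, axis=1)
    votes = train_y[np.argsort(dist)[:k]]
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]

# Hypothetical two-class feature data (e.g. treated vs. untreated cells)
rng = np.random.default_rng(0)
X0 = rng.normal([0, 0], 0.3, (20, 2))
X1 = rng.normal([2, 2], 0.3, (20, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 20 + [1] * 20)
```

Unlike the Bayes approach, K-NN makes no assumption about the class-conditional densities, which is what makes it non-parametric; its cost is that every classification requires distances to all training samples.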
50. A system for human jaw modeling using intra-oral images
- Author
-
Sameh M. Yamany and Aly A. Farag
- Subjects
Machine vision ,Computer science ,business.industry ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Image registration ,Video camera ,Grid ,law.invention ,Photometric stereo ,Data acquisition ,law ,Computer vision ,Artificial intelligence ,business ,Camera resectioning - Abstract
A novel integrated system is developed to obtain a record of the patient's occlusion using computer vision. Data acquisition is performed with an intra-oral video camera. A modified shape from shading (SFS) technique using perspective projection and camera calibration is then used to extract accurate 3D information from a sequence of 2D images of the jaw. A novel 3D data registration technique using the Grid Closest Point (GCP) transform and genetic algorithms (GA) registers the output of the SFS stage. Triangulation is then performed, and a solid 3D model is produced via a rapid prototyping machine. The overall goal of this research is to develop a model-based vision system for orthodontics that replaces traditional approaches and can be used for diagnosis, treatment planning, surgical simulation, and implants.
- Published
- 2002
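The Grid Closest Point transform used for registration in the abstract above trades memory for speed: a coarse grid is precomputed over the model surface, storing for each cell the index of the nearest model point, so every cost evaluation inside the GA becomes a table lookup instead of a nearest-neighbour search. A 2D sketch of that idea with synthetic points (the paper's exact data structure is not reproduced here):

```python
import numpy as np

def build_gcp(target, res):
    """Precompute, for every cell of a coarse grid covering the target
    point set, the index of the target point nearest to the cell centre."""
    lo = target.min(axis=0) - res
    dims = np.ceil((target.max(axis=0) + res - lo) / res).astype(int)
    ax = [lo[i] + (np.arange(dims[i]) + 0.5) * res for i in range(2)]
    gy, gx = np.meshgrid(ax[0], ax[1], indexing="ij")
    centers = np.stack([gy.ravel(), gx.ravel()], axis=1)
    # brute-force once at build time; lookups are O(1) afterwards
    d = np.linalg.norm(centers[:, None, :] - target[None, :, :], axis=2)
    return lo, dims, d.argmin(axis=1).reshape(dims)

def gcp_closest(q, target, lo, dims, grid, res):
    """Approximate closest target point to query q via one grid lookup."""
    c = np.clip(((q - lo) / res).astype(int), 0, dims - 1)
    return target[grid[tuple(c)]]

rng = np.random.default_rng(0)
target = rng.random((200, 2))                 # synthetic "model" points
res = 0.05
lo, dims, grid = build_gcp(target, res)
q = np.array([0.3, 0.7])
p = gcp_closest(q, target, lo, dims, grid, res)
```

The lookup returns a point within roughly one cell width of the true nearest neighbour, which is accurate enough for the registration cost while keeping each GA fitness evaluation cheap.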