71 results on '"K. M. S. Soyjaudah"'
Search Results
2. Pattern representation using Neuroevolution of the augmenting topology (NEAT) on Keystroke dynamics features in Biometrics.
- Author
-
Purvashi Baynath, K. M. S. Soyjaudah, and Maleika Heenaye-Mamode Khan
- Published
- 2019
- Full Text
- View/download PDF
3. A modified Rough Mode Decision process for fast High Efficiency Video Coding (HEVC) intra prediction.
- Author
-
Kanayah Saurty, Pierre Clarel Catherine, and K. M. S. Soyjaudah
- Published
- 2015
- Full Text
- View/download PDF
4. Early CU size determination in HEVC intra prediction using Average Pixel Cost.
- Author
-
Kanayah Saurty, Pierre Clarel Catherine, and K. M. S. Soyjaudah
- Published
- 2014
- Full Text
- View/download PDF
5. Improving the Performance of NEAT Related Algorithm via Complexity Reduction in Search Space.
- Author
-
Heman Mohabeer and K. M. S. Soyjaudah
- Published
- 2013
- Full Text
- View/download PDF
6. Joint source channel decoding of circular non binary turbo codes.
- Author
-
Yogesh Beeharry, T. P. Fowdur, and K. M. S. Soyjaudah
- Published
- 2013
- Full Text
- View/download PDF
7. An unequal error protection scheme for SPIHT image transmission with prioritised retransmissions and de-noising.
- Author
-
T. P. Fowdur, Deevya Indoonundon, and K. M. S. Soyjaudah
- Published
- 2013
- Full Text
- View/download PDF
8. Application of predictive coding in the evolution of artificial neural network.
- Author
-
Heman Mohabeer and K. M. S. Soyjaudah
- Published
- 2012
- Full Text
- View/download PDF
9. Erasing Bit Nodes on the Bipartite Graph for Enhanced Performance of LDPC Codes.
- Author
-
P. C. Catherine and K. M. S. Soyjaudah
- Published
- 2011
- Full Text
- View/download PDF
10. Parallel Concatenation of LDPC Codes with Two Sets of Source Bits.
- Author
-
P. C. Catherine and K. M. S. Soyjaudah
- Published
- 2011
- Full Text
- View/download PDF
11. A comparative study of secret code variants in terms of keystroke dynamics.
- Author
-
Narainsamy Pavaday and K. M. S. Soyjaudah
- Published
- 2008
- Full Text
- View/download PDF
12. On the Logical Computational Complexity Analysis of Turbo Decoding Algorithms for the LTE Standards
- Author
-
Yogesh Beeharry, Tulsi Pawan Fowdur, and K. M. S. Soyjaudah
- Subjects
Computational complexity theory ,Computer science ,Binary number ,Contrast (statistics) ,020206 networking & telecommunications ,02 engineering and technology ,Variation (game tree) ,Computer Science Applications ,0202 electrical engineering, electronic engineering, information engineering ,Turbo code ,020201 artificial intelligence & image processing ,Electrical and Electronic Engineering ,Algorithm ,Decoding methods ,Efficient energy use - Abstract
Evaluating the computational complexity of decoders is a very important aspect of Error Control Coding. However, most evaluations have been performed on hardware implementations. In this paper, different decoding algorithms for the binary Turbo codes used in LTE standards are investigated. Based on the mathematical operations in the various equations, the computational complexity is derived in terms of the number of binary logical operations. This work is important because it demonstrates the computational complexity breakdown at the binary logic level, since access to hardware implementations is not always available for research purposes. Also, in contrast to comparing different mathematical operations, comparing binary logic operations provides a standard footing for a fair comparative analysis of computational complexity. Using the decoding method with fewer binary logical operations significantly reduces the computational complexity, which in turn leads to a more energy-efficient/power-saving implementation. Results demonstrate the variation in computational complexity when using different algorithms for Turbo decoding, as well as with the incorporation of Sign Difference Ratio (SDR) and regression-based extrinsic information scaling and stopping mechanisms. When considering the conventional decoding mechanisms and streams of 16 bits in length, Method 3 uses 0.0065% more operations in total than Method 1. Furthermore, Method 2 uses only 0.0035% of the total logical complexity required with Method 1. This computational complexity analysis at the binary logic level can further be applied to other error correcting codes adopted in different communication standards.
- Published
- 2021
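Entry 12 expresses decoder complexity by mapping arithmetic operations onto binary logic operations. The sketch below illustrates that counting idea only; the gate-cost model (about 5w gates per w-bit ripple-carry addition, about 8w per w-bit max/select) and the per-stage operation counts for a generic 8-state max-log-MAP half-iteration are illustrative assumptions, not the paper's figures or its "Methods".

```python
"""Illustrative sketch: expressing decoder complexity as binary logic operations."""

W = 10                  # assumed fixed-point word length in bits
ADD_GATES = 5 * W       # assumed gates per addition (full adder = 2 XOR + 2 AND + 1 OR)
MAX_GATES = 8 * W       # assumed gates per max (subtract-based compare + 2:1 mux per bit)

def half_iteration_logic_ops(num_stages, states=8):
    """Count binary logic operations for one SISO half-iteration (generic model)."""
    # Branch metrics: one addition per state transition (2 transitions per state).
    branch_adds = 2 * states * num_stages
    # Forward and backward recursions: 2 adds + 1 max per state, in both directions.
    recursion_adds = 2 * 2 * states * num_stages
    recursion_maxs = 2 * states * num_stages
    # LLR computation: roughly 2 adds + 1 max per state per stage.
    llr_adds = 2 * states * num_stages
    llr_maxs = states * num_stages
    adds = branch_adds + recursion_adds + llr_adds
    maxs = recursion_maxs + llr_maxs
    return adds * ADD_GATES + maxs * MAX_GATES

if __name__ == "__main__":
    for k in (16, 1024, 6144):   # example block lengths (6144 is the largest LTE size)
        ops = half_iteration_logic_ops(k)
        print(f"K = {k:5d} info bits -> ~{ops:,} binary logic operations per half-iteration")
```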
13. Assessment and validation of global horizontal radiation: a case study in Mauritius
- Author
-
K. M. S. Soyjaudah and Yatindra Kumar Ramgolam
- Subjects
Insolation ,Meteorology ,Renewable Energy, Sustainability and the Environment ,business.industry ,020209 energy ,Photovoltaic system ,Site selection ,02 engineering and technology ,020401 chemical engineering ,Photovoltaics ,Financial evaluation ,0202 electrical engineering, electronic engineering, information engineering ,Environmental science ,0204 chemical engineering ,business - Abstract
Validated solar maps are very crucial for a holistic site selection, design and financial evaluation of photovoltaic (PV) projects. During this research, the temporal and spatial variations of glob...
- Published
- 2019
14. Performance of Hybrid Binary and Non-Binary Turbo Decoding Schemes for LTE and DVB-RCS Standards
- Author
-
Tulsi Pawan Fowdur, K. M. S. Soyjaudah, and Yogesh Beeharry
- Subjects
Amplitude modulation ,QAM ,Computer Networks and Communications ,Computer science ,Turbo code ,DVB-RCS ,Binary number ,Data_CODINGANDINFORMATIONTHEORY ,Electrical and Electronic Engineering ,Error detection and correction ,Algorithm ,Decoding methods ,Communication channel - Abstract
Binary and non-binary Turbo codes have been deployed in several digital communication standards to perform error correction. To enhance their error performance, several schemes such as Joint Source Channel Decoding (JSCD), extrinsic information scaling mechanisms, and prioritized QAM constellation mapping have been proposed. In this paper, hybrid schemes comprising JSCD, regression-based extrinsic information scaling, and prioritized 16-Quadrature Amplitude Modulation (QAM) with binary and non-binary Turbo codes are presented. A significant improvement in error performance is observed with the proposed scheme compared to the conventional one. For binary symmetric and asymmetric LTE Turbo codes and triple-binary Turbo codes, the hybrid scheme outperforms the conventional scheme by 0.8 dB on average. With duo-binary Turbo codes for the DVB-RCS standard, the hybrid scheme outperforms the conventional one with an average gain of 0.9 dB in BER performance.
- Published
- 2019
15. Enhanced MP3 transmission over Wi-Fi and LTE networks using unequal error protection and varying frequency transforms
- Author
-
Prateema Ragpot, K. M. S. Soyjaudah, and Tulsi Pawan Fowdur
- Subjects
lcsh:T58.5-58.64 ,lcsh:Information technology ,Computer Networks and Communications ,Computer science ,business.industry ,Quality of service ,unequal error protection ,ComputerSystemsOrganization_COMPUTER-COMMUNICATIONNETWORKS ,Data_CODINGANDINFORMATIONTHEORY ,lcsh:Telecommunication ,Computer Science Applications ,statistical qam ,lte ,Transmission (telecommunications) ,lcsh:TK5101-6720 ,audio transmission ,Computer Science (miscellaneous) ,High bandwidth ,The Internet ,Electrical and Electronic Engineering ,business ,wi-fi ,Computer network - Abstract
Wi-Fi and LTE are commonly used for the transmission of high-bandwidth data and multimedia over the internet. Providing a good quality of service in the transmission of such data over these wireless channels is challenging due to channel impairments such as noise and fading. This paper proposes three enhanced transmission schemes for audio over Wi-Fi and LTE. The first proposed scheme exploits the unequal importance of the bits generated by an MP3 codec to offer them different levels of protection during transmission by mapping the important bits onto prioritized QAM constellation bit positions. The second and third schemes use the statistical distribution of source symbols to map the bits of the encoded symbols to a statistical QAM (SQAM) constellation. In the second scheme, only the systematic bits from the encoder are mapped onto the SQAM constellation, whereas in the third scheme both the systematic and parity bits are mapped onto it. A comparative analysis with different frequency transforms has been carried out. The simulation results show that the proposed schemes increase system performance by 1–20 dB in segmented signal-to-noise ratio (SSNR). Using the DWT further increases the gains, by up to 90 dB for 16-QAM at rate ½, compared to the FFT.
- Published
- 2019
16. Modelling the impact of spectral irradiance and average photon energy on photocurrent of solar modules
- Author
-
Yatindra Kumar Ramgolam and K. M. S. Soyjaudah
- Subjects
Photocurrent ,Photon ,Materials science ,Spectrometer ,Renewable Energy, Sustainability and the Environment ,business.industry ,020209 energy ,Irradiance ,02 engineering and technology ,Photon energy ,Bin ,law.invention ,Wavelength ,Optics ,law ,Solar cell ,0202 electrical engineering, electronic engineering, information engineering ,General Materials Science ,business - Abstract
The photocurrent generated by a solar cell depends on environmental conditions as well as on the electrical and technological parameters of the cell. During this research, a holistic assessment of the performance of solar cell technologies is performed. Measurement of the solar spectrum is carried out from sunrise to sunset using a calibrated spectrometer. Temporal variations in the solar spectrum, spectral content and average photon energy (APE) are discussed. A model is developed to assess the impact of spectral irradiance and temperature on the photocurrent from planar pn junction cells. The effect of irradiance, temperature and average photon energy on the photocurrent generated by a poly-Si solar cell is simulated and analysed. The performance of other cell technologies is also discussed. Results show that the APE was 1.94 eV for the 300–1050 nm bin, 1.91 eV for the 350–1050 nm bin and 1.84 eV for the 400–1050 nm wavelength bin of the AM 1.5 solar spectrum. At low light intensity, the solar spectrum is rich in photons at near-red to IR wavelengths and hence the APE is 1.75 eV. The APE then increases proportionally with an increase in irradiance until a saturation level of approximately 1.95 eV is reached when irradiance exceeds 700 W/m2. Photocurrent is computed for varying spectral irradiance and temperature conditions. First, at low irradiance levels when the solar spectrum is rich in photons from red-IR regions, the photocurrent is linearly proportional to the APE. Secondly, as the intensity of light increases and the visible-UV components of light increase, an exponential relationship between the photocurrent and APE is exhibited.
- Published
- 2018
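Entry 16 relies on the average photon energy (APE) of the measured spectrum. As a minimal sketch, the standard definition APE = ∫E(λ)dλ / (q ∫Φ(λ)dλ), with photon flux Φ(λ) = E(λ)·λ/(hc), can be evaluated over a wavelength bin as below; the sample spectrum here is synthetic, not the paper's measurements.

```python
import numpy as np

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
Q = 1.602176634e-19  # elementary charge, J per eV

def average_photon_energy(wavelength_nm, irradiance_w_m2_nm, band=(350.0, 1050.0)):
    """APE (eV) over a wavelength bin: integrated irradiance / (q * integrated photon flux)."""
    wl = np.asarray(wavelength_nm, dtype=float)
    e = np.asarray(irradiance_w_m2_nm, dtype=float)
    mask = (wl >= band[0]) & (wl <= band[1])
    wl_m = wl[mask] * 1e-9                           # nm -> m
    e_band = e[mask]
    photon_flux = e_band * wl_m / (H * C)            # photons m^-2 s^-1 nm^-1
    total_irradiance = np.trapz(e_band, wl[mask])    # W/m^2
    total_flux = np.trapz(photon_flux, wl[mask])     # photons m^-2 s^-1
    return total_irradiance / (Q * total_flux)       # eV per photon

if __name__ == "__main__":
    # Synthetic spectrum purely for demonstration (arbitrary shape, W m^-2 nm^-1).
    wl = np.linspace(300, 1100, 801)
    e = np.exp(-((wl - 550) / 250.0) ** 2)
    print(f"APE (350-1050 nm bin): {average_photon_energy(wl, e):.2f} eV")
```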
17. Holistic performance appraisal of a photovoltaic system
- Author
-
Yatindra Kumar Ramgolam and K. M. S. Soyjaudah
- Subjects
Performance appraisal ,Annual production ,Performance ratio ,Meteorology ,Renewable Energy, Sustainability and the Environment ,020209 energy ,Statistical index ,Statistics ,Photovoltaic system ,0202 electrical engineering, electronic engineering, information engineering ,Environmental science ,02 engineering and technology ,Winter season - Abstract
During this research, the performance of a 2.45 kWp PV system was evaluated over a period of 5 years. The International Electrotechnical Commission (IEC) standard IEC 61724:1998 and a robust mathematical model were used as a guide for the purpose. The performance of the system was closely monitored during the dry winter season. Annual production decreased from 3463.8 kWh to 3370.9 kWh and the capacity utilisation factor decreased from 16.14% to 15.71% over the period. Monthly production was stochastic, but the average monthly production curve followed the same trend as incident global horizontal radiation, with low production during the dry winter months. The performance ratio was above 90% at the beginning of the dry winter season; it then decreased to less than 70% after three months. A mathematical model based on the five-parameter model and the one-diode equation was then used to extract essential cell parameters and simulate the performance of the system. Statistical indices were computed to assess the performance of the model against measurement data. Hence the quality factor of the PV system was computed and found to be between 60% and 100%, with an average of 87% in the dry winter months.
- Published
- 2017
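Entry 17 reports the capacity utilisation factor and the IEC 61724 performance ratio. A minimal sketch of those two indices follows; the numerical inputs are hypothetical placeholders, not the paper's measurements.

```python
def capacity_utilisation_factor(energy_kwh, p_rated_kwp, hours):
    """CUF = actual energy yield / energy if the array ran at rated power the whole time."""
    return energy_kwh / (p_rated_kwp * hours)

def performance_ratio(energy_kwh, p_rated_kwp, insolation_kwh_m2, g_stc_kw_m2=1.0):
    """IEC 61724 performance ratio: final yield (kWh/kWp) divided by reference yield."""
    final_yield = energy_kwh / p_rated_kwp               # kWh per kWp
    reference_yield = insolation_kwh_m2 / g_stc_kw_m2     # equivalent sun-hours
    return final_yield / reference_yield

if __name__ == "__main__":
    # Hypothetical annual values for a 2.45 kWp array (not the paper's data).
    e_year, p_rated, hours = 3400.0, 2.45, 8760
    insolation = 1650.0   # assumed annual global horizontal insolation, kWh/m2
    print(f"CUF = {capacity_utilisation_factor(e_year, p_rated, hours):.2%}")
    print(f"PR  = {performance_ratio(e_year, p_rated, insolation):.2%}")
```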
18. SYMBOL LEVEL DECODING FOR DUO-BINARY TURBO CODES
- Author
-
K. M. S. Soyjaudah, Tulsi Pawan Fowdur, and Yogesh Beeharry
- Subjects
General Computer Science ,Computational complexity theory ,Berlekamp–Welch algorithm ,Computer science ,Applied Mathematics ,General Chemical Engineering ,General Engineering ,List decoding ,Binary number ,Data_CODINGANDINFORMATIONTHEORY ,Sequential decoding ,Symbol (chemistry) ,lcsh:TA1-2040 ,Turbo code ,lcsh:Engineering (General). Civil engineering (General) ,Algorithm ,Decoding methods - Abstract
This paper investigates the performance of three different symbol level decoding algorithms for Duo-Binary Turbo codes. Explicit details of the computations involved in the three decoding techniques, and a computational complexity analysis are given. Simulation results with different couple lengths, code-rates, and QPSK modulation reveal that the symbol level decoding with bit-level information outperforms the symbol level decoding by 0.1 dB on average in the error floor region. Moreover, a complexity analysis reveals that symbol level decoding with bit-level information reduces the decoding complexity by 19.6 % in terms of the total number of computations required for each half-iteration as compared to symbol level decoding.
- Published
- 2017
19. An innovative multi-segment strategy for the classification of legal judgments using the k-nearest neighbour classifier
- Author
-
K. M. S. Soyjaudah, Sameerchand Pudaruth, and Rajendra Parsad Gunputh
- Subjects
Engineering ,Learning classifier system ,Span (category theory) ,Process (engineering) ,business.industry ,Confusion matrix ,Computational intelligence ,02 engineering and technology ,General Medicine ,computer.software_genre ,Variety (linguistics) ,020204 information systems ,0202 electrical engineering, electronic engineering, information engineering ,Selection (linguistics) ,020201 artificial intelligence & image processing ,Artificial intelligence ,Data mining ,business ,computer ,Classifier (UML) ,Natural language processing - Abstract
The classification of legal documents has been receiving considerable attention over the last few years, mainly because of the ever-increasing amount of legal information produced on a daily basis in the courts of law. In the Republic of Mauritius alone, a total of 141,164 cases were lodged in the different courts in the year 2015. The Judiciary of Mauritius is becoming more efficient owing to a number of measures that were implemented, and the number of cases disposed of each year has also risen significantly; however, this is still not enough to catch up with the increase in the number of new cases lodged. In this paper, we use the k-nearest neighbour machine learning classifier in a novel way. Unlike news articles, judgments are complex documents which usually span several pages and contain a variety of information about a case. Our approach consists of splitting the documents into equal-sized segments. Each segment is then classified independently of the others. The predicted category is then selected through a plurality voting procedure. Using this novel approach, we have been able to classify law cases with an accuracy of over 83.5%, which is 10.5% higher than when using the whole-document dataset. To the best of our knowledge, this type of process has never been used before to categorise legal judgments or other types of documents. In this work, we also propose a new measure, called confusability, to quantify the degree of scatteredness in a confusion matrix.
- Published
- 2017
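Entry 19 splits each judgment into equal-sized segments, classifies every segment independently with k-NN, and picks the category by plurality vote. A hedged sketch of that pipeline is given below; the TF-IDF features, the segment size of 200 words and k=5 are assumptions for illustration, not the paper's settings.

```python
from collections import Counter

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

SEGMENT_WORDS = 200   # assumed segment size; the paper tunes this

def split_into_segments(text, words_per_segment=SEGMENT_WORDS):
    """Break a long judgment into equal-sized word chunks."""
    words = text.split()
    return [" ".join(words[i:i + words_per_segment])
            for i in range((0), len(words), words_per_segment)] or [text]

def train(train_texts, train_labels, k=5):
    """Fit a TF-IDF vectoriser and a k-NN classifier on already-labelled documents."""
    vectorizer = TfidfVectorizer(stop_words="english")
    x = vectorizer.fit_transform(train_texts)
    clf = KNeighborsClassifier(n_neighbors=k).fit(x, train_labels)
    return vectorizer, clf

def classify_judgment(text, vectorizer, clf):
    """Classify every segment independently, then take a plurality vote."""
    segments = split_into_segments(text)
    votes = clf.predict(vectorizer.transform(segments))
    return Counter(votes).most_common(1)[0][0]
```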
20. Adjacent Channel Interference for DVB-T at UHF Bands in the South of Mauritius for Summer Season.
- Author
-
Vinaye Armoogum, K. M. S. Soyjaudah, A. Jugurnauth, Nawaz Mohamudally 0001, and Terence C. Fogarty
- Published
- 2007
- Full Text
- View/download PDF
21. Adaptive IEEE 802.11i Security for Energy-Security Optimization.
- Author
-
M. Razvi Doomun and K. M. S. Soyjaudah
- Published
- 2007
- Full Text
- View/download PDF
22. Comparative Study of Path Loss Using Existing Models for Digital Television Broadcasting for Summer Season in the North of Mauritius.
- Author
-
Vinaye Armoogum, K. M. S. Soyjaudah, Nawaz Mohamudally 0001, and Terence C. Fogarty
- Published
- 2007
- Full Text
- View/download PDF
23. An Integrated Unequal Error Protection Scheme for the Transmission of Compressed Images with ARQ.
- Author
-
K. M. S. Soyjaudah and T. P. Fowdur
- Published
- 2006
- Full Text
- View/download PDF
24. Statistical Time Channel Evaluation of Array Codes in Block Coded Phase.
- Author
-
Mussawir Ahmad Hosany and K. M. S. Soyjaudah
- Published
- 2006
- Full Text
- View/download PDF
25. Joint Variable Length Source Codes and Convolutional Codes for Text.
- Author
-
Mussawir Ahmad Hosany and K. M. S. Soyjaudah
- Published
- 2006
- Full Text
- View/download PDF
26. A Complexity Study of Joint and Separate Huffman with Array Codes.
- Author
-
Mussawir Ahmad Hosany and K. M. S. Soyjaudah
- Published
- 2006
- Full Text
- View/download PDF
27. Terminating CU Processing in HEVC Intra-Prediction Adaptively Based on Residual Statistics
- Author
-
Pierre Clarel Catherine, Kanayah Saurty, and K. M. S. Soyjaudah
- Subjects
Normalization property ,Computer science ,Statistics ,Residual ,Coding tree unit ,Coding (social sciences) ,Data compression - Abstract
The current standard in video compression, High Efficiency Video Coding (HEVC/H.265), provides superior compression performance compared to its H.264 predecessor. However, the large Coding Tree Unit (CTU) in H.265 brings about a considerable increase in processing time. In this paper, a method for terminating Coding Unit (CU) processing early is proposed, based on the luma residual statistics gathered during the encoding of the initial frames of the sequence. The gathered statistics are then formulated adaptively into thresholds and used to avoid the unnecessary processing of potential CUs during subsequent frames. Experimental results indicate that the encoding time can be reduced by 36.1% on average compared to HM16, with a BD-Rate increase of only 0.29%.
- Published
- 2019
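Entry 27 derives its early-termination thresholds adaptively from residual statistics gathered on the initial frames. The sketch below is a deliberate simplification of that idea: it assumes the statistic is the mean absolute luma residual of a CU and the threshold is a percentile of the values observed for CUs that the full search chose not to split; the paper's actual formulation differs.

```python
import numpy as np

class AdaptiveCuTerminator:
    """Collect residual statistics during a warm-up phase, then derive a threshold.

    Illustrative simplification: the warm-up length and the percentile rule
    below are assumptions, not the thresholds formulated in the paper.
    """

    def __init__(self, warmup_cus=2000, percentile=75.0):
        self.warmup_cus = warmup_cus
        self.percentile = percentile
        self._unsplit_costs = []   # residual costs of CUs that ended up not split
        self.threshold = None

    def observe(self, residual_cost, was_split):
        """Feed statistics from CUs encoded with the full search during initial frames."""
        if not was_split:
            self._unsplit_costs.append(residual_cost)
        if self.threshold is None and len(self._unsplit_costs) >= self.warmup_cus:
            self.threshold = np.percentile(self._unsplit_costs, self.percentile)

    def terminate_early(self, residual_cost):
        """True -> skip further splitting of this CU in subsequent frames."""
        return self.threshold is not None and residual_cost <= self.threshold

def mean_abs_residual(luma_residual_block):
    """A simple per-CU residual cost used as the observed statistic."""
    return float(np.mean(np.abs(luma_residual_block)))
```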
28. An Investigation of the TCP Meltdown Problem and Proposing Raptor Codes as a Novel to Decrease TCP Retransmissions in VPN Systems
- Author
-
Irfaan Coonjah, K. M. S. Soyjaudah, and Pierre Clarel Catherine
- Subjects
Focus (computing) ,business.industry ,Computer science ,ComputerSystemsOrganization_COMPUTER-COMMUNICATIONNETWORKS ,Scalability ,Process (computing) ,Layer (object-oriented design) ,business ,Protocol (object-oriented programming) ,Queue ,Raptor code ,Computer network - Abstract
When TCP was designed, its designers did not cater for the problem of running TCP within itself, and this dilemma was not originally addressed. The protocol is meant to be reliable and uses adaptive timeouts to decide when a resend should occur. This design can fail when stacking TCP connections, and this type of network slowdown is known as the "TCP meltdown problem." It happens when a slower outer connection causes the upper layer to queue up more retransmissions than the lower layer is able to process. The designers of the Virtual Private Networking product OpenVPN accommodated the problems that may occur when tunneling TCP within TCP by designing the VPN to use UDP as the base for communication and so increase performance. However, UDP is said to be unreliable, and not all VPN systems support UDP tunneling. This paper seeks to provide systems with low-latency primitives for reliable communication that are fundamentally scalable and robust. The focus of the authors is on proposing Raptor codes to solve the TCP meltdown problem in VPN systems and decrease delays and overheads. The authors of this paper simulate the TCP meltdown problem inside a VPN tunnel.
- Published
- 2018
29. A Novel Prioritised Concealment and Flexible Macroblock Ordering Scheme for Video Transmission
- Author
-
D. Indoonundon, K. M. S Soyjaudah, and Tulsi Pawan Fowdur
- Subjects
Flexible Macroblock Ordering ,Scheme (programming language) ,021110 strategic, defence & security studies ,Computer science ,Real-time computing ,0211 other engineering and technologies ,0202 electrical engineering, electronic engineering, information engineering ,020206 networking & telecommunications ,02 engineering and technology ,Video transmission ,computer ,computer.programming_language - Published
- 2016
30. A PEG Construction of LDPC Codes Based on the Betweenness Centrality Metric
- Author
-
I. Bhurtah-Seewoosungkur, K. M. S. Soyjaudah, and Pierre Clarel Catherine
- Subjects
Discrete mathematics ,lcsh:Computer engineering. Computer hardware ,General Computer Science ,channel coding ,AWGN channels ,020206 networking & telecommunications ,lcsh:TK7885-7895 ,02 engineering and technology ,Combinatorics ,block codes ,error correction codes ,Betweenness centrality ,Metric (mathematics) ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,lcsh:Electrical engineering. Electronics. Nuclear engineering ,Electrical and Electronic Engineering ,Low-density parity-check code ,parity check codes ,lcsh:TK1-9971 ,Mathematics ,MathematicsofComputing_DISCRETEMATHEMATICS - Abstract
Progressive Edge Growth (PEG) constructions are usually based on optimizing the distance metric by various methods. In this work, however, the distance metric is replaced by a different one, namely the betweenness centrality metric, which was shown to enhance routing performance in wireless mesh networks. A new type of PEG construction for Low-Density Parity-Check (LDPC) codes is introduced based on the betweenness centrality metric borrowed from social network terminology, given that the bipartite graph describing the LDPC code is analogous to a network of nodes. The algorithm is very efficient in filling edges on the bipartite graph, adding connections in an edge-by-edge manner. The smallest graph size the new construction can achieve surpasses that obtained from a modified PEG algorithm, the RandPEG algorithm. To the best of the authors' knowledge, this paper produces the best regular LDPC column-weight-two graphs. In addition, the technique proves to be competitive in terms of error-correcting performance. When compared to MacKay, PEG and other recent modified-PEG codes, the algorithm gives better performance at high SNR due to its particular edge and local graph properties.
- Published
- 2016
31. A novel scaling and early stopping mechanism for LTE turbo code based on regression analysis
- Author
-
K. M. S. Soyjaudah, Yogesh Beeharry, and Tulsi Pawan Fowdur
- Subjects
Early stopping ,010102 general mathematics ,020206 networking & telecommunications ,02 engineering and technology ,Mutual information ,01 natural sciences ,QAM ,Amplitude modulation ,0202 electrical engineering, electronic engineering, information engineering ,Turbo code ,Electronic engineering ,0101 mathematics ,Electrical and Electronic Engineering ,Algorithm ,Scaling ,Decoding methods ,Mathematics ,Phase-shift keying - Abstract
In this paper, a new extrinsic information scaling and early stopping mechanism for long term evolution (LTE) turbo code is proposed. A scaling factor is obtained by computing the Pearson's correlation coefficient between the extrinsic and a posteriori log-likelihood ratio (LLR) at every half-iteration. Additionally, two new stopping criteria are proposed. The first one uses the regression angle which is computed at each half-iteration and is applied at low Eb/N0. The second one uses Pearson's correlation coefficient and is applicable for high Eb/N0 values. The performance of the proposed scheme was compared against an existing scaling and stopping mechanism based on the sign difference ratio (SDR) technique as well as conventional LTE turbo code. Simulations have been performed with both quadrature phase shift keying (QPSK) modulation and 16-quadrature amplitude modulation (QAM) together with code rates of 1/3 and 1/2. The results demonstrate that the proposed scheme outperforms both the conventional scheme and that employing the SDR-based scaling and stopping mechanism in terms of BER performance and average number of decoding iterations. The performance analysis using EXIT charts for each scheme shows higher initial output mutual information for input mutual information of zero. Better convergence is also demonstrated with the wider tunnel for the proposed scheme. Additionally, the computational complexity analysis demonstrates a significant gain in terms of the average number of computations per packet with the different modulation and coding schemes while still gaining in terms of error performance.
- Published
- 2016
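Entry 31 scales the extrinsic information by the Pearson correlation between the extrinsic and a-posteriori LLRs at each half-iteration and stops decoding once convergence is indicated. A minimal numpy sketch of that step follows; the correlation-based stopping threshold used here is an arbitrary illustration, not the paper's criterion (which also uses a regression angle at low Eb/N0).

```python
import numpy as np

def correlation_scaling(extrinsic_llr, aposteriori_llr):
    """Scaling factor = Pearson correlation between extrinsic and a-posteriori LLRs."""
    return float(np.corrcoef(extrinsic_llr, aposteriori_llr)[0, 1])

def half_iteration_update(extrinsic_llr, aposteriori_llr, stop_corr=0.99):
    """Scale the extrinsic LLRs and report whether decoding may stop (illustrative threshold)."""
    rho = correlation_scaling(extrinsic_llr, aposteriori_llr)
    scaled_extrinsic = rho * extrinsic_llr
    stop = rho >= stop_corr
    return scaled_extrinsic, rho, stop

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    apost = rng.normal(0.0, 4.0, size=1024)                 # synthetic a-posteriori LLRs
    extr = 0.7 * apost + rng.normal(0.0, 1.0, size=1024)    # synthetic extrinsic LLRs
    _, rho, stop = half_iteration_update(extr, apost)
    print(f"scaling factor = {rho:.3f}, stop = {stop}")
```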
32. Design and Implementation of UDP Tunneling-based on OpenSSH VPN
- Author
-
Pierre Clarel Catherine, K. M. S. Soyjaudah, and Irfaan Coonjah
- Subjects
Open source ,Computer science ,business.industry ,Bandwidth (signal processing) ,Encryption ,business ,Computer network - Abstract
This paper focuses on two commonly used VPNs: OpenVPN and OpenSSH. Both VPN solutions are open source, cross-platform, secure and highly configurable. OpenSSH forms part of a large research group, OpenBSD, and is integrated into routers, switches and almost all operating systems by default. Compared to OpenSSH, OpenVPN is less widely used and its research group is small. OpenVPN can be used only for tunneling purposes, whereas OpenSSH has many features of which tunneling is only one. The research by the OpenBSD developers is moving towards security and encryption and has not progressed in the field of VPN tunneling since 2011. The only weakness of the OpenSSH VPN is that it does not support UDP as the mode of communication, whereas OpenVPN can use both TCP and UDP. There is a need for a UDP tunnel in industry in situations where bandwidth is critical (satellite links), and the default option for VPN systems is OpenVPN. The authors of this paper modified the OpenSSH implementation by adding support for a UDP-based connection to its VPN functionality so that OpenSSH can be the VPN of choice for industry.
- Published
- 2018
33. Unveiling the solar resource potential for photovoltaic applications in Mauritius
- Author
-
K. M. S. Soyjaudah and Yatindra Kumar Ramgolam
- Subjects
Meteorology ,Renewable Energy, Sustainability and the Environment ,business.industry ,Photovoltaics ,Solar Resource ,Photovoltaic system ,Irradiance ,Environmental science ,business ,Solar maximum ,Solar irradiance ,Solar energy ,Solar power - Abstract
Mauritius is considered to have high solar resource potential, but it has not yet been fully quantified and exploited due to the lack of valid solar energy data. This paper unveils the solar potential of Mauritius. Ground-based measurements were performed at intervals of 30 s in order to obtain accurate global horizontal irradiance data which can depict all changes in solar power. The latter is used to evaluate average monthly global horizontal irradiance, maximum irradiance, monthly average insolation and monthly sky clearness index. A solar geometry model was used to define the average monthly, seasonal and yearly maximum elevations and extraterrestrial radiation. Measurement data were compared to Meteonorm and NASA SSE 3-hourly averaged solar data. The comparison shows that average irradiance values are in good agreement, whereas insolation and sky clearness values obtained from external sources are inferior to high-quality measurement data. The results presented in this paper firstly complement the solar data of Meteonorm and NASA SSE and secondly provide PV and solar engineers, as well as scientists, with highly valuable information on the solar resource of Mauritius that can be used during planning and design of PV systems as well as for conducting further research in Mauritius and surrounding regions.
- Published
- 2015
34. Secured SAML cloud authentication using fingerprint
- Author
-
Gianeshwar Ramsawock, Muhammad Yaasir Khodabacchus, and K. M. S. Soyjaudah
- Subjects
Password ,Engineering ,Authentication ,business.industry ,Fingerprint (computing) ,XML Signature ,Computer security ,computer.software_genre ,Credential ,Security Assertion Markup Language ,ComputingMilieux_MANAGEMENTOFCOMPUTINGANDINFORMATIONSYSTEMS ,CVSS ,business ,computer ,Vulnerability (computing) - Abstract
Single Sign-On (SSO) is a centralized authentication method in which a consumer gains access to many services using one credential. This evolution is driving the next generation of authentication on the cloud. However, Single Sign-On has one big point of failure: if an attacker obtains one credential, he may gain access to several providers. The SAML (Security Assertion Markup Language) SSO protocol, which is widely used with a password, suffers from this vulnerability. In the past, researchers have focused their attention on the XML Signature Wrapping vulnerability and much less on the main component of authentication. This research provides a solution to this problem by combining the dominant features of cancelable fingerprints with the existing SAML SSO protocol. During our tests, the effectiveness of the solution has been examined using the CVSS (Common Vulnerability Scoring System). Based on the scores obtained, it is concluded that the new combined solution of cancelable fingerprints and the SAML SSO protocol offers better protection against the attacks. Nevertheless, the score can be further improved.
- Published
- 2017
35. Fast adaptive inter-splitting decisions for HEVC based on luma residuals
- Author
-
K. M. S. Soyjaudah, Pierre Clarel Catherine, and Kanayah Saurty
- Subjects
Reduction (complexity) ,Motion estimation ,Algorithmic efficiency ,Real-time computing ,Quadtree ,Encoder ,Algorithm ,Coding tree unit ,Random access ,Coding (social sciences) ,Mathematics - Abstract
The long encoding time of High Efficiency Video Coding (HEVC) compared to its predecessor, Advanced Video Coding (AVC), is mostly associated with the large number of Coding Units (CUs) to be processed during the quadtree splitting of the 64 × 64 Coding Tree Unit (CTU), along with the improved but intensive Motion Estimation (ME) techniques. In this paper, the unnecessary processing of some CUs during the recursive splitting of the CTU, along with some of the two-PU mode operations, is skipped so as to bring a significant time reduction for the encoder. Statistical distributions of the HEVC splitting decisions based on the Mean Square (MS) values of each 8 × 8 block within the 2N × 2N luma residuals are adaptively constructed during the starting frames of each sequence. Thereafter, thresholds for early termination of the CU and early identification of the 2N × 2N PU mode based on these distributions are applied during the encoding of subsequent inter frames. The proposed inter-mode scheme significantly reduces the total encoding time with negligible loss of coding efficiency. Experimental results show that the proposed scheme effectively achieves 47.0% encoding time savings with a Bjontegaard Delta bitrate (BDBR) increase of only 0.57% for various test sequences under random access conditions.
- Published
- 2017
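Entry 35 bases its split/no-split and 2N×2N PU decisions on the mean-square values of each 8×8 block inside the 2N×2N luma residual. The sketch below only shows how those blockwise statistics can be computed with a numpy reshape and fed into a simple two-threshold decision; the threshold values are placeholders, whereas the paper constructs them adaptively from the starting frames.

```python
import numpy as np

def blockwise_mean_square(residual, block=8):
    """Mean-square value of every (block x block) tile of a 2N x 2N luma residual."""
    r = np.asarray(residual, dtype=np.float64)
    n = r.shape[0]
    assert r.shape == (n, n) and n % block == 0
    tiles = r.reshape(n // block, block, n // block, block).swapaxes(1, 2)
    return (tiles ** 2).mean(axis=(2, 3))        # shape: (n/block, n/block)

def splitting_decision(residual, t_no_split=50.0, t_split=400.0):
    """Illustrative early decision: homogeneous, low-energy residual -> stop splitting."""
    ms = blockwise_mean_square(residual)
    if ms.max() <= t_no_split:
        return "terminate CU (no further split)"
    if ms.min() >= t_split:
        return "split CU further"
    return "undecided: fall back to full RD search"

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    cu = rng.normal(0, 3, size=(32, 32))          # synthetic 32x32 luma residual
    print(splitting_decision(cu))
```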
36. Performance Analysis of Symmetric and Asymmetric LTE Turbo Codes with Prioritisation and Regression Based Scaling
- Author
-
K. M. S. Soyjaudah, Tulsi Pawan Fowdur, and Yogesh Beeharry
- Subjects
QAM ,Hardware_GENERAL ,Modulation ,Computer science ,Hybrid system ,Electronic engineering ,Range (statistics) ,Turbo code ,Data_CODINGANDINFORMATIONTHEORY ,Scaling ,Regression ,Term (time) - Abstract
Standards like Long Term Evolution (LTE) employ Turbo coded QAM systems in order to achieve high data rates. Although several mechanisms have been proposed to enhance the error performance of Turbo coded QAM systems, there is still a need for novel or hybrid systems which can contribute towards further improved error performance. In this paper, a comparative analysis is performed between symmetric and asymmetric LTE Turbo codes with the incorporation of techniques such as prioritization and regression-based extrinsic information scaling. Results demonstrate that a significant enhancement in error performance throughout the whole Eb/N0 range can be obtained with high-order modulation when these techniques are used. With both symmetric and asymmetric LTE Turbo codes employing 64-QAM and a code rate of 1/3, an average gain of 0.3 dB below BERs of 10^-1 is obtained over the corresponding conventional LTE Turbo codes.
- Published
- 2017
37. Performance of Unequal Error Protection Schemes for Audio Transmission Over ADSL with Reed Solomon and Turbo Codes
- Author
-
Prateema Ragpot, Tulsi Pawan Fowdur, and K. M. S. Soyjaudah
- Subjects
Asymmetric digital subscriber line ,business.industry ,Computer science ,Data_CODINGANDINFORMATIONTHEORY ,Division (mathematics) ,Transmission (telecommunications) ,Modulation ,Reed–Solomon error correction ,Electronic engineering ,Code (cryptography) ,Turbo code ,Telecommunications ,business ,Block (data storage) - Abstract
This paper performs a comparative analysis of Unequal Error Protection (UEP) schemes for audio transmission over an Asymmetric Digital Subscriber Line (ADSL). The proposed UEP scheme applies three different levels of protection: first at the Discrete Multi-tone Modulation (DMT) layer, second with prioritized retransmissions, and finally by using different code rates. A comparative analysis between Reed-Solomon (RS) codes and Turbo codes is performed using two different block sizes. For the Turbo codes, Sign Difference Ratio (SDR)-based extrinsic information scaling is also applied to further enhance performance. Results show that when UEP is applied at three different levels, a gain of 11 dB is obtained over a scheme which applies UEP at only two levels.
- Published
- 2017
38. A Question Answer System for the Mauritian Judiciary
- Author
-
P. Domun, Sameerchand Pudaruth, Rajendra Parsad Gunputh, and K. M. S. Soyjaudah
- Subjects
050502 law ,Computer science ,business.industry ,05 social sciences ,Internet privacy ,Information technology ,ComputingMilieux_LEGALASPECTSOFCOMPUTING ,Computer security ,computer.software_genre ,Supreme court ,Legal research ,Knowledge-based systems ,Knowledge extraction ,Order (business) ,0509 other social sciences ,050904 information & library sciences ,business ,Human resources ,Know-how ,computer ,Natural language ,0505 law - Abstract
Law is a research-oriented profession and legal research is an activity that costs time and money. Information Technology is now revolutionising the way in which legal research is done. In this work, we have implemented an online web-based question answer system for the Mauritian Judiciary where users can enter their queries freely using natural language. The system processes a query by extracting relevant keywords, discarding those that do not carry much information, and then returns the relevant sections of law which contain these keywords or keyphrases. The system also returns a list of relevant Supreme Court cases. The user can decide on the number of results to be displayed, and can also choose to have only the names of the relevant acts displayed for certain keywords or keyphrases. The system does not require the user to know how the law is structured or how the knowledge base is built in order to benefit from it. The portal can also be accessed via mobile devices without compromising any of its facilities or user-friendliness. It is hoped that the availability of information at the click of a button will help the human resources at the Mauritian Judiciary become more efficient and will contribute to the reduction of delays in the disposal of cases.
- Published
- 2016
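Entry 38 extracts keywords from a natural-language query, discards low-information words, and returns the sections of law that contain them. A toy sketch of that retrieval step is given below; the stop-word list, the scoring rule and the two-section knowledge base are hypothetical placeholders, not the system's actual data or ranking.

```python
import re
from collections import Counter

# Illustrative stop-word list (the real system's list would be larger).
STOP_WORDS = {"the", "a", "an", "of", "in", "on", "for", "to", "and", "or",
              "is", "are", "what", "which", "how", "can", "my", "i"}

def extract_keywords(query):
    """Tokenise the query and drop words that carry little information."""
    tokens = re.findall(r"[a-z']+", query.lower())
    return [t for t in tokens if t not in STOP_WORDS]

def search_sections(query, sections, top_n=5):
    """Rank law sections by how often they contain the query keywords."""
    keywords = extract_keywords(query)
    scores = Counter()
    for section_id, text in sections.items():
        text_lower = text.lower()
        scores[section_id] = sum(text_lower.count(k) for k in keywords)
    return [s for s, score in scores.most_common(top_n) if score > 0]

if __name__ == "__main__":
    sections = {   # hypothetical knowledge base entries
        "Road Traffic Act s.123": "No person shall drive a vehicle without a licence ...",
        "Labour Act s.5": "Every worker shall be entitled to annual leave ...",
    }
    print(search_sections("Can I drive without a licence?", sections))
```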
39. Fingerprint code authentication protocol on cloud
- Author
-
Gianeswar Ramsawok, Muhammad Yaasir Khodabacchus, and K. M. S. Soyjaudah
- Subjects
021110 strategic, defence & security studies ,Authentication ,Cloud computing security ,Biometrics ,business.industry ,Computer science ,Data_MISCELLANEOUS ,Internet privacy ,Fingerprint (computing) ,0211 other engineering and technologies ,Cloud computing ,02 engineering and technology ,Fingerprint recognition ,Security token ,Computer security ,computer.software_genre ,Authentication protocol ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,business ,computer - Abstract
Cloud computing has recently introduced immense changes to people's lifestyles and working patterns owing to its numerous advantages. However, potential security issues are often the barrier to its widespread application. Modern biometric technologies such as fingerprints promise a different answer to cloud security. Even though there are areas in which biometric methods provide benefits, they remain at risk of attack. Towards addressing these problems and raising public confidence, a novel protocol for biometric authentication against stolen tokens, based on cancelable fingerprints, is presented.
- Published
- 2016
40. Risk score calculation for cloud biometric authentication
- Author
-
K. M. S. Soyjaudah, Gianeswar Ramsawok, and Muhammad Yaasir Khodabacchus
- Subjects
Password ,020205 medical informatics ,Biometrics ,business.industry ,Computer science ,Cloud computing ,02 engineering and technology ,Computer security ,computer.software_genre ,Security token ,Authentication (law) ,Ranking ,SAFER ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,business ,computer ,Vulnerability (computing) - Abstract
Biometric-based authentication is maturing into probably the most promising candidate for either replacing or improving conventional approaches based on passwords or tokens on the cloud. This technology provides a safer and more user-friendly approach to authentication. However, some vulnerabilities threaten the adoption of biometrics. One of these arises because users can access resources from different locations on different devices, and one person can have many scanners. There is therefore undoubtedly a risk associated with permitting a request to access a resource. In order to contribute to resolving this challenge, a risk score calculation is proposed. It is a procedure in which a risk engine calculates a score; the score is compared with a set threshold, and a decision on whether or not to proceed with authentication is made based on this comparison.
- Published
- 2016
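Entry 40 has a risk engine compute a score for each authentication request and compare it against a threshold before biometric authentication proceeds. A hypothetical weighted-factor sketch of that flow is shown below; the factor names, weights and threshold are illustrative assumptions, not the paper's model.

```python
# Illustrative risk signals and weights (assumptions, not taken from the paper).
RISK_WEIGHTS = {
    "unknown_device": 0.35,
    "unknown_location": 0.25,
    "unusual_time": 0.15,
    "low_fingerprint_match": 0.25,
}

def risk_score(signals):
    """Weighted sum of boolean risk signals, in the range [0, 1]."""
    return sum(weight for name, weight in RISK_WEIGHTS.items() if signals.get(name, False))

def allow_authentication(signals, threshold=0.5):
    """Proceed with biometric authentication only if the risk score stays below the threshold."""
    return risk_score(signals) < threshold

if __name__ == "__main__":
    request = {"unknown_device": True, "unknown_location": False,
               "unusual_time": True, "low_fingerprint_match": False}
    print(f"risk = {risk_score(request):.2f}, allow = {allow_authentication(request)}")
```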
41. Evaluation of UDP tunnel for data replication in data centers and cloud environment
- Author
-
Irfaan Coonjah, K. M. S. Soyjaudah, and Pierre Clarel Catherine
- Subjects
021110 strategic, defence & security studies ,business.industry ,Computer science ,Network packet ,Reliability (computer networking) ,ComputerSystemsOrganization_COMPUTER-COMMUNICATIONNETWORKS ,05 social sciences ,0211 other engineering and technologies ,050801 communication & media studies ,Cloud computing ,02 engineering and technology ,UDP flood attack ,Replication (computing) ,0508 media and communications ,Packet loss ,Synchronization (computer science) ,Data center ,business ,Computer network - Abstract
The era of Data Centers is underway. Cloud is an off-premise form of computing that stores data on the Internet, whereas a data center refers to on-premise computing. For cloud-hosting purposes, vendors also often own multiple data centers in several geographic locations to safeguard data availability during outages and other data center failures. Data replication and synchronization techniques have recently attracted a lot of attention from researchers in the computing community. Data transferred during replication needs to be secure, and tunneling is one way to secure information before sending it to an off-premise replication site. The two kinds of tunnel that exist are TCP and UDP tunnels. A UDP tunnel is claimed to be faster for transferring data than a TCP tunnel, mainly because UDP does not make use of excessive acknowledgement messages and does not suffer from the TCP meltdown problem. In situations where bandwidth is limited, UDP is the preferred option. This paper addresses the reliability of the UDP tunnel by measuring packet drops when sending different packet sizes. The authors want to determine whether a UDP tunnel can be used as the mode of transfer during data replication. A series of tests has been performed and the MTU size has been adjusted for minimal packet loss. The authors demonstrate that a UDP tunnel with an MTU size of 1150 bytes can be used as the mode of transfer for data centers.
- Published
- 2016
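Entry 41 measures packet drops through a UDP tunnel for different packet sizes in order to settle on a workable MTU. A bare-bones loss probe in that spirit is sketched below; it simply sends sequence-numbered datagrams of a chosen size to a peer that counts arrivals. The host, port, payload size and timing are placeholders, and this is not the measurement tool used in the paper.

```python
import socket
import time

def send_probe(host, port, payload_size, count=1000, interval=0.001):
    """Send `count` sequence-numbered datagrams padded to `payload_size` bytes."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for seq in range(count):
        payload = seq.to_bytes(4, "big").ljust(payload_size, b"\0")
        sock.sendto(payload, (host, port))
        time.sleep(interval)
    sock.close()

def receive_probe(port, count=1000, timeout=5.0):
    """Count how many of the expected datagrams arrive and report the loss rate."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    sock.settimeout(timeout)
    received = set()
    try:
        while len(received) < count:
            data, _ = sock.recvfrom(65535)
            received.add(int.from_bytes(data[:4], "big"))
    except socket.timeout:
        pass
    finally:
        sock.close()
    loss = 1.0 - len(received) / count
    print(f"received {len(received)}/{count} datagrams, loss = {loss:.2%}")
    return loss

# Example (run receive_probe(5005) on the far end of the tunnel first):
# send_probe("10.8.0.2", 5005, payload_size=1150)   # payload size echoing the 1150-byte MTU above
```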
42. Performance modelling and assessment of photovoltaic systems: A case for tropical region
- Author
-
Yatindra Kumar Ramgolam and K. M. S. Soyjaudah
- Subjects
Engineering ,business.industry ,020209 energy ,Photovoltaic system ,02 engineering and technology ,Solar irradiance ,Power (physics) ,Electric power system ,0202 electrical engineering, electronic engineering, information engineering ,Grid-connected photovoltaic power system ,Electronic engineering ,business ,Simulation ,Energy (signal processing) ,Voltage - Abstract
During the research, the performance of a 2.45 kWp photovoltaic power system (PVPS) located in a tropical region was measured and modelled. A bottom-up cell-to-array approach was used to simulate the PVPS performance. Cell parameters were extracted using 4- and 5-parameter models. The PV system power, current and voltage outputs were modelled using the one-diode model, taking into account solar irradiance data measured at 30-second intervals. These results were compared to those obtained through simulation of the 2.45 kWp system in NREL SAM. Finally, the simulation results were compared to the measured energy output of the system. It can be concluded that only the proposed cell-to-array approach can be reliably used to assess the output current, voltage, power and energy of a PVPS with high accuracy. Compared to other models, the results of the proposed approach were closer to the recorded energy output values.
- Published
- 2016
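Entry 42 models the array bottom-up with a one-diode model and 4/5-parameter cell extraction. As a minimal sketch, the implicit one-diode equation I = Iph − I0·(exp((V + I·Rs)/(n·Ns·Vt)) − 1) − (V + I·Rs)/Rsh can be solved for the operating current with a bracketing root finder, as below; the module parameter values are placeholders, not the extracted ones from the paper.

```python
import numpy as np
from scipy.optimize import brentq

K_B, Q = 1.380649e-23, 1.602176634e-19   # Boltzmann constant, elementary charge

def one_diode_current(v, i_ph, i_0, r_s, r_sh, n, n_cells, temp_c=25.0):
    """Solve I = Iph - I0*(exp((V+I*Rs)/(n*Ns*Vt)) - 1) - (V+I*Rs)/Rsh for I."""
    v_t = K_B * (temp_c + 273.15) / Q     # thermal voltage
    def f(i):
        v_d = v + i * r_s
        return i_ph - i_0 * (np.exp(v_d / (n * n_cells * v_t)) - 1.0) - v_d / r_sh - i
    return brentq(f, -i_ph, 2.0 * i_ph)

if __name__ == "__main__":
    # Placeholder 60-cell module parameters (not the paper's extracted values).
    params = dict(i_ph=8.2, i_0=5e-8, r_s=0.35, r_sh=300.0, n=1.3, n_cells=60)
    for v in (0.0, 20.0, 30.0, 36.0):
        print(f"V = {v:5.1f} V -> I = {one_diode_current(v, **params):6.3f} A")
```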
43. Fast Intra Mode Decision for HEVC
- Author
-
Kanayah Saurty, Pierre Clarel Catherine, and K. M. S. Soyjaudah
- Subjects
Computer science ,Prediction methods ,Time saving ,Rate distortion ,Algorithm ,Intra mode ,Encoder complexity ,Coding (social sciences) ,Data compression - Abstract
High Efficiency Video Coding (HEVC/H.265), the latest standard in video compression, aims to halve the bitrate while maintaining the same quality, or to achieve the same bitrate with improved quality, compared to its predecessor, AVC/H.264. However, the increase in prediction modes in HEVC significantly impacts encoder complexity. Intra prediction iterates among 35 modes for each Prediction Unit (PU) to select the optimal one. This mode decision procedure, which consumes around 78% of the time spent in intra prediction, consists of the Rough Mode Decision (RMD), the simplified Rate Distortion Optimisation (RDO) and the full RDO processes. In this chapter, considerable time reduction is achieved by using techniques that evaluate fewer modes in both the RMD and the simplified RDO processes. Experimental results show that the proposed method yields 42.1% time savings on average with an acceptable drop of 0.075 dB in PSNR and a negligible increase of 0.27% in bitrate.
- Published
- 2016
44. Performance evaluation and analysis of layer 3 tunneling between OpenSSH and OpenVPN in a wide area network environment
- Author
-
Pierre Clarel Catherine, Irfaan Coonjah, and K. M. S. Soyjaudah
- Subjects
Computer science ,business.industry ,Secure Shell ,ComputerSystemsOrganization_COMPUTER-COMMUNICATIONNETWORKS ,Shared medium ,Encryption ,Tunneling protocol ,law.invention ,Wide area network ,law ,Internet Protocol ,The Internet ,business ,Private network ,Computer network - Abstract
Virtual Private Networks (VPNs) provide secure, encrypted communication between remote networks worldwide by using Internet Protocol (IP) tunnels and a shared medium like the Internet. End-to-end connectivity is established by tunneling. OpenVPN and OpenSSH are cross-platform, secure, highly configurable VPN solutions. However, a performance comparison between OpenVPN and OpenSSH VPN has not yet been undertaken. This paper focuses on such a comparison and evaluates the efficiency of these VPNs over Wide Area Network (WAN) connections. The same conditions are maintained for a fair comparison. To the best of the authors' knowledge, these are the first reported test results for these two commonly used VPN technologies. Three parameters, namely speed, latency and jitter, are evaluated. Using a real-life scenario with deployment on the Linux operating system, a comprehensive, in-depth comparative analysis of the VPN mechanisms is provided. Results of the analysis between OpenSSH and OpenVPN show that OpenSSH utilizes the link better and significantly improves transfer times.
- Published
- 2015
45. Experimental performance comparison between TCP vs UDP tunnel using OpenVPN
- Author
-
K. M. S. Soyjaudah, Irfaan Coonjah, and Pierre Clarel Catherine
- Subjects
Engineering ,TCP acceleration ,Point-to-Point Tunneling Protocol ,business.industry ,Datagram ,Performance comparison ,ComputerSystemsOrganization_COMPUTER-COMMUNICATIONNETWORKS ,TCP hole punching ,Zeta-TCP ,Latency (engineering) ,business ,UDP flood attack ,Computer network - Abstract
The comparison between TCP and UDP tunnels has not been sufficiently reported in the scientific literature. In this work, we use OpenVPN as a platform to compare the performance of TCP and UDP tunneling. The de facto belief has been that a TCP tunnel provides a permanent tunnel and therefore ensures reliable transfer of data between two endpoints. However, the effect of transmitting TCP within a UDP tunnel has been explored and proves a worthwhile alternative. The results provided in this paper demonstrate that TCP in a UDP tunnel indeed provides better latency. Throughout this paper, a series of tests has been performed in which UDP traffic was sent inside a UDP tunnel and a TCP tunnel successively; the same tests were then performed using TCP traffic.
- Published
- 2015
46. 6to4 tunneling framework using OpenSSH
- Author
-
Irfaan Coonjah, Pierre Clarel Catherine, and K. M. S. Soyjaudah
- Subjects
Computer science ,computer.internet_protocol ,business.industry ,Secure Shell ,ComputerSystemsOrganization_COMPUTER-COMMUNICATIONNETWORKS ,Tunneling protocol ,IPv4 ,IPv6 ,Hardware_INTEGRATEDCIRCUITS ,6to4 ,business ,computer ,Quantum tunnelling ,Computer network - Abstract
6to4 tunneling enables IPv6 hosts and routers to connect with other IPv6 hosts and routers over the existing IPv4 Internet. The main purpose of IPv6 tunneling is to maintain compatibility with the large existing base of IPv4 hosts and routers. OpenSSH VPN tunneling is said to have limitations with numerous IPv6 clients, and it is therefore usually advisable to use OpenVPN. To the best of the authors' knowledge, this is the first reported successful implementation of 6to4 tunneling over OpenSSH with more than one client. This proof of concept therefore positions OpenSSH as a potential alternative to conventional VPNs.
- Published
- 2015
47. Performance of duo-binary turbo codes with optimized hierarchical modulation
- Author
-
Yogesh Beeharry, K. M. S. Soyjaudah, and Tulsi Pawan Fowdur
- Subjects
Computer science ,020208 electrical & electronic engineering ,0202 electrical engineering, electronic engineering, information engineering ,Turbo code ,Binary number ,020206 networking & telecommunications ,Hierarchical modulation ,02 engineering and technology ,Algorithm - Published
- 2018
48. Joint source channel decoding and iterative symbol combining with turbo trellis-coded modulation
- Author
-
T. P. Fowdur and K. M. S. Soyjaudah
- Subjects
Iterative method ,Computer science ,Variable-length code ,Data_CODINGANDINFORMATIONTHEORY ,Trellis (graph) ,Huffman coding ,symbols.namesake ,Control and Systems Engineering ,Signal Processing ,symbols ,Turbo code ,Electronic engineering ,Computer Vision and Pattern Recognition ,Electrical and Electronic Engineering ,Trellis modulation ,Algorithm ,Software ,Decoding methods ,Communication channel - Abstract
A new joint source channel decoding (JSCD) scheme for turbo trellis-coded modulation is proposed which exploits symbol a-priori probabilities to improve decoding performance. To obtain the symbol a-priori probabilities, we propose a new method which uses a time-stretched bit-level trellis of a variable length code (VLC) decoder alongside a symbol-level trellis. Iterative symbol combining (ISC) is also integrated with the proposed JSCD scheme so that multiple transmissions of the same data packet can be combined to improve decoding. Essentially, JSCD with ISC (JSCD-ISC) exploits two types of a-priori information: the first is the symbol a-priori probabilities derived from source statistics, and the second is obtained from the saved extrinsic information of previously transmitted copies of the data. Several simulations with different VLC sources, such as Huffman and reversible variable length codes (RVLC), and for different numbers of retransmissions have been performed. The most striking result is obtained with RVLC, whereby, using only two retransmissions, the JSCD-ISC scheme outperforms a conventional scheme using three retransmissions by 0.2 dB.
- Published
- 2009
49. EFFICIENT RECOVERY TECHNIQUE FOR LOW-DENSITY PARITY-CHECK CODES USING REDUCED-SET DECODING
- Author
-
K. M. S. Soyjaudah and Pierre Clarel Catherine
- Subjects
Block code ,Theoretical computer science ,Concatenated error correction code ,BCJR algorithm ,General Medicine ,Sequential decoding ,Serial concatenated convolutional codes ,Linear code ,Hardware and Architecture ,Electrical and Electronic Engineering ,Low-density parity-check code ,Algorithm ,Factor graph ,Mathematics - Abstract
We introduce a recovery algorithm for low-density parity-check codes that provides substantial coding gain over the conventional method. Concisely, it consists of an inference procedure based on successive decoding rounds using different subsets of bit nodes from the bipartite graph representing the code. The technique also sheds light on certain characteristics of the sum–product algorithm and effectively copes with the problems of trapping sets, cycles, and other anomalies that adversely affect the performance of LDPC codes.
- Published
- 2008
50. Robust JPEG image transmission using unequal error protection and code combining
- Author
-
T. P. Fowdur and K. M. S. Soyjaudah
- Subjects
Computer Networks and Communications ,Electrical and Electronic Engineering - Published
- 2007