53 results for "Suriayati Chuprat"
Search Results
2. Acute lymphoblastic leukemia segmentation using local pixel information
- Author
-
Saif S. Al-jaboriy, Wafaa Mustafa Abduallah, Suriayati Chuprat, and Nilam Nur Amir Sjarif
- Subjects
Computer science, Lymphoblastic leukemia, Cell segmentation, Image processing, Artificial intelligence, Precursor cell, Medicine, Segmentation, Sensitivity, Pixel, Artificial neural network, Pattern recognition, Signal processing, Computer vision and pattern recognition, Bone marrow, Noise, Software - Abstract
The severity of acute lymphoblastic leukemia depends on the percentage of blast cells (abnormal white blood cells) in bone marrow or peripheral blood. Manual microscopic examination of bone marrow is less accurate, time-consuming, and susceptible to errors, making it difficult for lab workers to accurately recognize the characteristics of blast cells. Researchers have adopted different computational methods to identify the nature of blast cells; however, these methods are incapable of accurately segmenting leukocyte cells due to major disadvantages such as lack of contrast between objects and background, sensitivity to gray-scale variation, sensitivity to image noise, and large computational cost. It is therefore indispensable to develop a new and improved technique for leukocyte cell segmentation. In the present research, an automatic leukocyte cell segmentation process was introduced that is based on a machine learning approach and image processing techniques. The characteristics of blast cells were extracted using 4-moment statistical features and artificial neural networks (ANNs). The proposed method yielded a blast cell segmentation accuracy of 97% under different lighting conditions.
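The "4-moment statistical features" the abstract names can be sketched as the mean, variance, skewness, and kurtosis of local pixel neighborhoods, which would then feed an ANN classifier. The window size and block-wise scan below are assumptions for illustration, not the paper's exact pipeline:

```python
import numpy as np

def four_moment_features(patch):
    """First four statistical moments of a gray-scale patch:
    mean, variance, skewness, kurtosis (assumed to be the
    features fed to the ANN classifier)."""
    x = patch.astype(float).ravel()
    mu = x.mean()
    var = x.var()
    std = np.sqrt(var) if var > 0 else 1.0  # guard flat patches
    skew = np.mean(((x - mu) / std) ** 3)
    kurt = np.mean(((x - mu) / std) ** 4)
    return np.array([mu, var, skew, kurt])

def patch_features(image, size=5):
    """Slide a non-overlapping window over the image and collect
    one 4-moment feature vector per block (block size is assumed)."""
    h, w = image.shape
    feats = []
    for i in range(0, h - size + 1, size):
        for j in range(0, w - size + 1, size):
            feats.append(four_moment_features(image[i:i + size, j:j + size]))
    return np.array(feats)
```

Each row of the resulting feature matrix would be one training sample for the ANN (blast vs. non-blast).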
- Published
- 2019
- Full Text
- View/download PDF
3. Hybrid Heuristic Load Balancing Algorithm For Resource Allocation In Cloud Computing
- Author
-
Suriayati Chuprat, Anup Shrestha, and Nandini Mukherjee
- Subjects
Computer science, Heuristic, Virtual machine, Cloud computing, Completion time, Load balancing, Algorithm - Abstract
Cloud computing is becoming more popular than conventional computing due to its added advantages: it offers utility-based services to subscribers on demand, and users pay for every use of the IT services it provides. However, the increasing number of tasks requires virtual machines for them to be accomplished quickly, and load balancing has become a critical concern in cloud computing due to the massive increase in the number of users. This paper proposes a hybrid heuristic load balancing algorithm that schedules resource allocation so as to minimize makespan (completion time) in any technology that involves the use of cloud computing. The proposed algorithm performs better than other load balancing algorithms.
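The abstract does not specify the hybrid heuristic itself, but the makespan-minimization goal it describes can be illustrated with a simple greedy sketch (an assumed baseline, not the paper's algorithm):

```python
def greedy_makespan_schedule(task_times, n_vms):
    """Assign each task to the VM with the smallest current load,
    a classic greedy heuristic that keeps the makespan (maximum VM
    completion time) low. Sorting longest-first (the LPT rule)
    tightens the greedy bound."""
    loads = [0.0] * n_vms
    assignment = []
    for t in sorted(task_times, reverse=True):
        vm = loads.index(min(loads))  # least-loaded VM
        loads[vm] += t
        assignment.append((t, vm))
    return assignment, max(loads)
```

For example, five tasks of lengths [4, 2, 1] on two VMs yield a makespan of 4: the long task occupies one VM while the two short ones share the other.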
- Published
- 2020
- Full Text
- View/download PDF
4. Risk Management Framework for Distributed Software Team: A Case Study of Telecommunication Company
- Author
-
Wan Suzila Wan Husin, Azri Azmi, Yazriwati Yahya, Nurulhuda Firdaus Mohd Azmi, Nilam Nur Amir Sjarif, and Suriayati Chuprat
- Subjects
Process management, Computer science, Risk management framework, Organizational culture, Networking and telecommunications, Context, Knowledge sharing, Survey methodology, Software, Risk management - Abstract
Distributed software development (DSD) has grown rapidly over the last few years and presents definite risks to the software industry which need to be carefully analyzed and managed. The risks relating to DSD include lack of trust, ineffective communication, time zone differences, cultural differences such as language and corporate culture, and knowledge sharing challenges. The software risk management approach in DSD, however, is still inadequate and needs further attention. The aim of this paper is to identify the components involved in the risk management process related to DSD, and ultimately to enhance the existing risk management framework used in the organization to accommodate the distributed nature of the team. A quantitative approach, namely the survey method, was chosen as the research method to achieve the objectives of this paper. The results show that communication is the most prevalent risk faced by team members in the current risk management practice. Data from the literature and the survey were used to expand the list of identified risks in the organization and to integrate a communication element into the existing framework. The communication element emphasizes the collaboration and commitment of stakeholders from every site so as to improve risk management in a distributed context.
- Published
- 2019
- Full Text
- View/download PDF
5. Determining Factors Influencing the Acceptance of Cloud Computing Implementation
- Author
-
Suriayati Chuprat, Roslina Ibrahim, Mohd Talmizie Amron, and Nur Azaliah Abu Bakar
- Subjects
Government, Knowledge management, Vendor, Computer science, Data security, Cloud computing, Business continuity, Scalability, Thematic analysis, Social influence - Abstract
Cloud computing (CC) has attracted many organizations to invest in this virtual storage technology, since it is seen as helping businesses manage and share data in a more flexible, cost-saving, and scalable way. However, CC faces many issues, such as data security concerns, the high cost of the set-up process, designing the cloud model, and high dependency on cloud providers. Apart from these issues, several studies have also highlighted other factors that influence the acceptance of CC implementation, especially in the organizational environment. Therefore, this paper aims to review and identify the relevant factors that influence the acceptance of CC implementation in organizations. This study reviewed 55 articles related to CC implementation, and a total of 21 factors were obtained through several screening processes and ranked by frequency using the thematic analysis method: Compatibility, Top Management Support, Relative Advantage, Security, Complexity, External Pressure, IT Knowledge, Cost, Trust, Trialability, Regulations & Government Support, Innovativeness, External Expertise, Sharing & Collaboration, User Experiences, Awareness, Firm Size, Social Influence, Task, Vendor Support, and Business Continuity. This study unveils a new context in cloud computing studies, as it provides more insight into factors that may have been previously untouched.
- Published
- 2019
- Full Text
- View/download PDF
6. The Validity and Reliability Evaluation of Instruments for Cloud Computing Acceptance Study
- Author
-
Suriayati Chuprat, Roslina Ibrahim, Mohd Talmizie Amron, and Nur Azaliah Abu Bakar
- Subjects
Data collection, Computer science, Validity, Cloud computing, Data science, Exploratory factor analysis, Cronbach's alpha, Content validity, Reliability, Face validity - Abstract
Online data storage technology over the cloud network has become an option for many organizations, even for personal use. The benefits of cloud computing enable many organizations, including the public sector, to use this technology to provide the best service experience. However, there is an issue with the implementation of cloud-based applications when their usage falls short of the number of applications offered. Therefore, a study on the acceptance of cloud computing in the public sector should be conducted. This paper aims to evaluate the validity and reliability of an instrument for studying cloud computing acceptance in the Malaysian public sector. The developed instrument is analyzed through validity and reliability phases. The validity phase involves two stages, face validity and expert validity; the Content Validity Index (CVI) is used, and the panel's feedback is taken into account in improving the items. The reliability phase evaluates Cronbach's alpha for each item and applies Exploratory Factor Analysis (EFA). The final instrument contains 71 items with 5-point Likert scale options, classified under 15 variables. As a result, the instrument is successfully validated and is reliable enough to be used in the actual data collection.
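The reliability step the abstract describes, evaluating Cronbach's alpha, follows a standard formula over an (respondents × items) score matrix; a minimal sketch (not the paper's code):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals).
    Values near 1 indicate high internal consistency."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # sample variance per item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of row totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

Perfectly correlated items give alpha = 1.0, while uncorrelated items drive it toward zero.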
- Published
- 2020
- Full Text
- View/download PDF
7. A model for technological aspect of e-learning readiness in higher education
- Author
-
Mohd Naz'ri Mahrin, Rasimah Che Mohd Yusoff, Asma Ali Mosa Al-araibi, and Suriayati Chuprat
- Subjects
Knowledge management, Higher education, Educational technology, Information technology, Cloud computing, Electronic media, Library and information sciences, Education, Empirical research, Information and communications technology, Computer literacy - Abstract
The rate of adoption of e-learning has increased significantly in most higher education institutions in the world. E-learning refers to the use of electronic media, educational technology, and information and communication technology (ICT) in the educational process. The aim of adopting e-learning is to provide students with educational services via ICT, so that students can access educational resources from anywhere and at any time. However, successful implementation of e-learning relies on readiness: without proper readiness, the project will probably fail. E-learning readiness refers to the assessment of how ready an institution is to adopt and implement an e-learning project. One of the most important aspects of e-learning readiness is the technological aspect, which plays an important role in implementing an effective and efficient e-learning system, yet there is currently little research on the factors that shape it. This study therefore concentrates on the technological aspect of e-learning readiness. A model is proposed which includes eight technological factors: software; hardware; connectivity; security; flexibility of the system; technical skills and support; cloud computing; and data center. A quantitative study was conducted at six Malaysian public universities, with survey responses from 374 academic staff members who use e-learning. The empirical study confirmed that seven of the technological factors have a significant effect on e-learning readiness, while one factor (cloud computing) does not yet have a significant impact.
- Published
- 2018
- Full Text
- View/download PDF
8. Data Reconstruction Through Sequence Based Mapping in Secured Data Partitioning
- Author
-
Othman Mohd Yusop, Hazila Hasan, Suriayati Chuprat, and Haslina Md Sarkan
- Subjects
Database, Computer science, Data reconstruction, Big data, Search engine indexing, Cloud computing, Partition (database), Secret sharing, Data partitioning, Confidentiality, Data mining - Abstract
Recent research has proposed data partitioning techniques with secret sharing to enhance security in cloud computing. However, the complexity of reconstruction while preserving confidentiality limits their practical use, specifically when a large amount of data is involved. In this paper, we explored the existing mapping technique, called partition based indexing, that is used to reconstruct the shares, and found that its efficiency decreases as the amount of data increases. This motivated us to propose a sequence based mapping to increase the efficiency of data reconstruction in secured data partitioning with secret sharing. The proposed technique was evaluated through a series of simulations using 10,000 data items, with performance measured as the time taken to reconstruct the data for different numbers of shares. The results show that our sequence based mapping technique improved the performance of data reconstruction by more than 40 percent compared to the indexing technique. We therefore conclude that sequence based mapping is an ideal technique for improving the performance of data reconstruction in data partitioning with secret sharing while preserving the confidentiality of big data in cloud computing.
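The abstract leaves the sequence based mapping unspecified; one plausible reading, sketched here purely as an assumption, is that each share carries its own sequence number, so reconstruction needs only a sort of the shares rather than a lookup in a separate partition index:

```python
def make_shares(data, n_parts):
    """Split data into n_parts shares, each tagged with a sequence
    number (a simplified stand-in for the paper's scheme; real
    secret sharing would also transform each chunk)."""
    size = -(-len(data) // n_parts)  # ceiling division
    return [(seq, data[seq * size:(seq + 1) * size])
            for seq in range(n_parts)]

def reconstruct(shares):
    """Reorder shares by their embedded sequence number and
    concatenate -- no external index structure needed."""
    return ''.join(chunk for _, chunk in sorted(shares))
```

Because the ordering information travels with each share, reconstruction cost stays a single O(n log n) sort even as the data volume grows.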
- Published
- 2017
- Full Text
- View/download PDF
9. Improving the accuracy of collaborative filtering recommendations using clustering and association rules mining on implicit data
- Author
-
Mohd Naz'ri Mahrin, Haslina Md Sarkan, Suriayati Chuprat, and Maryam Khanian Najafabadi
- Subjects
Association rule learning, Computer science, Probabilistic logic, Recommender system, Human-computer interaction, Collaborative filtering, Preprocessing, The Internet, Data mining, Cluster analysis, Curse of dimensionality - Abstract
Recommender systems are becoming more significant in the age of rapid development of Internet technology due to their ability to help users make appropriate choices. Collaborative filtering (CF) is the most successful and most applied technique in the design of recommender systems, where items are recommended to an active user based on the past rating records of like-minded users. Unfortunately, CF can lead to poor recommendations when user ratings on items are very sparse in comparison with the huge number of users and items in the user-item matrix. To overcome this problem, this research applies users' implicit interaction records with items, processed efficiently at scale with association rules mining. It captures multiple purchases per transaction in the association rules, rather than just counting total purchases made. To do this, a modified preprocessing step is implemented to discover similar interest patterns among users based on the multiple purchases made. In addition, a clustering technique is employed to reduce the size of the data and the dimensionality of the item space, improving the performance of association rules mining. Similarities between items based on their features are then computed to make recommendations. Experiments were conducted on the public Million Song dataset and the results were compared with basic CF and other extended versions of CF, including K-Means clustering, hybrid representation, and probabilistic learning. The experimental results demonstrate that our technique achieves better performance than basic CF and the other extended versions of CF in terms of Precision and Recall, even when the data is very sparse.
- Tackle data sparsity using clustering and association rules mining on massive data.
- Utilizing users' implicit interaction records with items for improving CF.
- Using item repetition in a transaction as the input for association rules.
- Experiments show that the proposed technique substantially outperforms basic CF.
- Comparing the accuracy of the proposed technique with other extended versions of CF.
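The association-rules step on implicit transactions can be sketched as below; the clustering stage and the repeat-purchase weighting the abstract mentions are omitted, and all names here are illustrative:

```python
from collections import Counter, defaultdict
from itertools import combinations

def pairwise_rules(transactions, min_support=2):
    """Mine simple pairwise rules (A -> B) from implicit transaction
    data, scoring by confidence = support(A,B) / support(A)."""
    item_count = Counter()
    pair_count = Counter()
    for t in transactions:
        items = sorted(set(t))           # one vote per item per basket
        item_count.update(items)
        pair_count.update(combinations(items, 2))
    rules = defaultdict(list)
    for (a, b), n in pair_count.items():
        if n >= min_support:
            rules[a].append((b, n / item_count[a]))
            rules[b].append((a, n / item_count[b]))
    return rules

def recommend(rules, basket, top_n=3):
    """Score items associated with what the user already interacted
    with, summing confidences across the basket."""
    scores = Counter()
    for item in basket:
        for other, conf in rules.get(item, []):
            if other not in basket:
                scores[other] += conf
    return [i for i, _ in scores.most_common(top_n)]
```

On implicit data such as song plays, the transactions would be per-user listening sessions rather than explicit ratings, which is what lets this approach sidestep rating sparsity.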
- Published
- 2017
- Full Text
- View/download PDF
10. Software Engineering Wastes - A Perspective of Modern Code Review
- Author
-
Nargis Fatima, Sumaira Nazir, and Suriayati Chuprat
- Subjects
Code review, Process (engineering), Computer science, Software development, Context, Lean manufacturing, Software quality, Domain (software engineering), Software, Software engineering - Abstract
Identification and eradication of waste are the principal emphases of lean thinking. Waste is defined as any activity that consumes resources but does not deliver any value to the stakeholder; it can also be described as an impediment to process flow. Lean thinking has been applied in the software engineering domain to overall software development; however, there is still a need to address waste identification and elimination in specific software engineering activities. This paper describes the wastes generated during Modern Code Review (MCR). MCR is a socio-technical software engineering activity acknowledged as a lightweight process for defect identification, code improvement, and software quality enhancement. It involves coordination and communication among multiple software engineers with different personalities, preferences, and technical skills, and can therefore generate multiple types of waste. The study has two objectives: to recognize and report the various wastes generated during MCR, and to map the identified MCR wastes onto the existing software engineering wastes. A Systematic Literature Review and grounded theory were utilized to produce a unique list of the wastes generated during MCR. The identified list of MCR wastes and their mapping onto existing software engineering wastes were validated by software engineering experts. The study findings report 28 unique wastes, of which 25 map to existing software engineering wastes; 3 wastes (negative emotions, inequality/bias, and insignificant feedback) are not reported in the existing software engineering literature. The study will be useful for researchers to identify wastes in the same context or in other software engineering activities and to provide strategies to minimize the generation of the identified wastes.
- Published
- 2020
- Full Text
- View/download PDF
11. Knowledge Sharing Factors for Modern Code Review to Minimize Software Engineering Waste
- Author
-
Sumaira Nazir, Nargis Fatima, and Suriayati Chuprat
- Subjects
Code review, Computer science, Artifact (software development), Grounded theory, Knowledge sharing, Mental distress, Systematic review, Categorization, Software engineering - Abstract
Software engineering activities such as Modern Code Review (MCR) produce quality software by identifying defects in the code. MCR involves social coding and provides ample opportunities to share knowledge among MCR team members. However, the MCR team is confronted with the issue of waiting waste due to poor knowledge sharing among team members, which delays projects and increases mental distress. To minimize waiting waste, this study aims to identify the factors that impact knowledge sharing in MCR. The methodology employed is a systematic literature review to identify knowledge sharing factors, followed by data coding with the continual comparison and memoing techniques of grounded theory to produce a unique, categorized list of factors influencing knowledge sharing. The identified factors were then assessed by an expert panel for their naming, expression, and categorization. The study findings report 22 factors grouped into 5 broad categories: Individual, Team, Social, Facility Conditions, and Artifact. The study is useful for researchers extending this research and for MCR teams that wish to consider these factors to enhance knowledge sharing and minimize waiting waste.
- Published
- 2020
- Full Text
- View/download PDF
12. Situational Modern Code Review Framework to Support Individual Sustainability of Software Engineers
- Author
-
Nargis Fatima, Sumaira Nazir, and Suriayati Chuprat
- Subjects
Code review, Source code, Process management, Delphi method, Software development, Identification, Software, Sustainability, Quality, Situational ethics, Competence - Abstract
Modern Code Review (MCR) is a socio-technical practice to improve source code quality and ensure successful software development. It involves the interaction of software engineers from different cultures and backgrounds, so a variety of unknown situational factors arise that impact the individual sustainability of MCR team members and affect their productivity by causing mental distress and fear of unknown and varying situations. The MCR team therefore needs to be aware of the relevant situational factors, but is confronted with a lack of competency in identifying them. This study conducts a Delphi survey to investigate an optimal and well-balanced set of MCR-related situational factors and to recognize and prioritize the most influential situational factors for MCR activities. The study findings report 21 situational factors, 147 sub-factors, and 5 categories. Based on the results of the Delphi survey, the identified situational factors are transformed into a situational MCR framework. This study can support the individual sustainability of the MCR team by making them aware of the situations that can occur and vary during the execution of the MCR process, help the MCR team improve their productivity and remain in the industry longer, and support software researchers who want to contribute to situational software engineering from varying software engineering contexts.
- Published
- 2020
- Full Text
- View/download PDF
13. Knowledge Sharing Framework for Modern Code Review to Diminish Software Engineering Waste
- Author
-
Nargis Fatima, Sumaira Nazir, and Suriayati Chuprat
- Subjects
Code review, Computer science, Delphi method, Psychological distress, Knowledge sharing, Waste production, Lean software development, Software engineering, Quality assurance - Abstract
Modern Code Review (MCR) is a quality assurance technique that involves massive interaction between MCR team members. Presently, MCR team members are confronted with the problem of waiting waste, which results in psychological distress and project delays. The MCR team therefore needs effective knowledge sharing during MCR activities to avoid the circumstances that lead team members into a waiting state. The objective of this study is to develop a knowledge sharing framework for the MCR team to reduce waiting waste. The research methodology used is a Delphi survey, intended to produce a finalized list of knowledge sharing factors and to recognize and prioritize the most influential knowledge sharing factors for MCR activities. The study results report 22 knowledge sharing factors, 135 sub-factors, and 5 categories. Based on the results of the Delphi survey, a knowledge sharing framework for MCR has been developed. The study is beneficial for software engineering researchers extending this research, and MCR team members can apply the designed framework to increase knowledge sharing and diminish waiting waste.
- Published
- 2020
- Full Text
- View/download PDF
14. Situational Factors for Modern Code Review to Support Software Engineers’ Sustainability
- Author
-
Nargis Fatima, Suriayati Chuprat, and Sumaira Nazir
- Subjects
Source code, Code review, Knowledge management, Situation awareness, Computer science, Grounded theory, Identification, Software, Sustainability, Situational ethics - Abstract
Software engineers working in Modern Code Review (MCR) are confronted with a lack of competency in the identification of situational factors. MCR is a software engineering activity for identifying and fixing defects before delivery of the software product. This issue can be a threat to the individual sustainability of software engineers, and it can be addressed by situational awareness. Therefore, the objective of the study is to identify situational factors concerning the MCR process. A Systematic Literature Review (SLR) was used to identify situational factors. Data coding, along with the continuous comparison and memoing procedures of grounded theory, and expert review were used to produce an exclusive, validated list of situational factors grouped under categories. The study results report 23 situational factors grouped into 5 broad categories: People, Organization, Technology, Source Code, and Project. The study is valuable for researchers extending this research and for software engineers seeking to identify situations and sustain themselves longer.
- Published
- 2020
- Full Text
- View/download PDF
15. Knowledge sharing, a key sustainable practice is on risk: An insight from Modern Code Review
- Author
-
Nargis Fatima, Suriayati Chuprat, and Sumaira Nazir
- Subjects
Code review, Knowledge management, Computer science, Context, Asset (computer security), Grounded theory, Knowledge sharing, Systematic review, Quality, Product - Abstract
Knowledge is the foremost asset for succeeding in the area of green sustainable software engineering, and effective knowledge sharing is a crucial objective of companies; however, individuals are often not motivated to share knowledge. This study explores knowledge sharing in the context of Modern Code Review (MCR), where engineers assess source code written by other developers to improve product quality through knowledge exchange between author and reviewer. The process depends heavily on individual interactions among developers, which contribute to the outcomes of MCR such as identifying alternative solutions, code improvement, and knowledge sharing. Although knowledge sharing is regarded as an important outcome of MCR, developers involved in MCR activities face a lack of knowledge sharing due to a dearth of motivation, and various individual factors may encourage or hinder developers from engaging in it. Studies based on data gathered from surveys, interviews, and review tools have examined elements affecting MCR, but the literature still lacks research on the factors impacting the knowledge sharing aspect of MCR. Hence, this study performs a Systematic Literature Review (SLR) to distinguish the individual factors impacting knowledge sharing in MCR. The review identifies 34 individual factors, such as bias, cognitive load, time pressure, and expertise, grouped into 7 broad categories using grounded theory. The findings have implications for software engineers involved in MCR activities and for researchers who wish to broaden this research, enabling MCR activities to be performed adequately by considering and overcoming the negative effects of the identified factors.
- Published
- 2019
- Full Text
- View/download PDF
16. Situational factors affecting Software Engineers Sustainability: A Vision of Modern Code Review
- Author
-
Sumaira Nazir, Nargis Fatima, and Suriayati Chuprat
- Subjects
Code review, Source code, Computer science, Grounded theory, Software quality, Knowledge sharing, Engineering management, Software, Situational ethics, Competence - Abstract
Code review is a sustainable software engineering practice to assure software quality. Presently it is performed by software engineers with tool support, for instance Gerrit, and is termed Modern Code Review (MCR). It is a lightweight approach to identifying defects in source code. Software engineers' efficiency while performing MCR activities is exposed to varying people-related situational factors that impact their sustainability, yet the literature lacks an identification of these factors in MCR. For successful MCR outcomes, there is a need to distinguish people-related situational factors to enhance the productivity and competence of software engineers and support their sustainability. Therefore, this investigation aims to recognize the situational variables influencing the ability and efficiency of software engineers. The systematic methodology utilized is a Systematic Literature Review (SLR). The study results report 52 situational factors, assembled into four fundamental categories: team, team interaction, reviewer response, and knowledge sharing. A grounded theory approach was applied to obtain a final unique list of people-related situational factors and sub-factors. The investigation will be useful for experts researching situational factors as well as for the sustainability of software engineers engaged in MCR activities.
- Published
- 2019
- Full Text
- View/download PDF
17. Understanding the Impact of Feedback on Knowledge Sharing in Modern Code Review
- Author
-
Nargis Fatima, Sumaira Nazir, and Suriayati Chuprat
- Subjects
Code review, Knowledge management, Conceptualization, Computer science, Constructive feedback, Grounded theory, Software quality, Knowledge sharing, Formative assessment, Systematic review - Abstract
Modern Code Review (MCR) is a key practice for improving code quality and sharing knowledge between authors and reviewers. Feedback from the reviewer is an important aspect of the review process that initiates knowledge sharing. In MCR, knowledge sharing is reported as an important benefit, yet it is also reported as a challenge for author and reviewer: it is argued that reviewers do not share knowledge in the form of insightful, constructive, and formative feedback, and developers face poor knowledge sharing in addition to other MCR challenges. Since feedback from the reviewer is an important facet of knowledge sharing, the various factors associated with reviewer feedback can hamper the knowledge sharing aspect of the MCR process. There is a gap in the MCR literature concerning the conceptualization of feedback, its importance, and its impact on knowledge sharing. Therefore, this study explores the factors associated with feedback that impact knowledge sharing in MCR. Systematic Literature Review (SLR), grounded theory, and expert opinion strategies are followed to produce a unique, categorized, and validated list of feedback-related factors that impact knowledge sharing in MCR. The study identifies 42 factors, such as feedback usefulness, harsh comments, criticism of the author, and broadcast feedback, grouped into 14 unique categories, for instance feedback communication, feedback language, feedback content, and feedback temporal aspects. The study has implications for practitioners engaged in MCR activities and for researchers interested in this area who wish to broaden the investigation, enabling teams to share knowledge adequately by bearing in mind and coping with the negative effects of the identified factors, and thus minimizing the production of software engineering wastes.
- Published
- 2019
- Full Text
- View/download PDF
18. Individual Sustainability Barriers and Mitigation Strategies: Systematic Literature Review Protocol
- Author
-
Sumaira Nazir, Nargis Fatima, and Suriayati Chuprat
- Subjects
Software Engineering Body of Knowledge ,Process management ,Software ,Systematic review ,business.industry ,Computer science ,Sustainability ,Sustainable practices ,Software system ,business ,Social dimension ,Protocol (object-oriented programming) - Abstract
Software engineering supports the development of software systems that ease and benefit humanity. However, humanity faces various complex barriers that pose risks to societies, for instance a lack of sustainable software engineering practices, tools and methodologies. Software engineers can support and contribute to sustainability by developing sustainable software products through sustainable practices while managing the economic, environmental, social, technical and individual dimensions of sustainability. The economic, environmental, technical and social dimensions have been well explored in the literature, but the individual sustainability dimension still needs insight from researchers. Therefore, this study presents a Systematic Literature Review (SLR) protocol to identify individual sustainability barriers along with their mitigation strategies. This research will be helpful for the sustainable software engineering body of knowledge and for researchers interested in contributing to sustainability concerns in software engineering, by providing an SLR protocol to identify a prioritized list of barriers and mitigation strategies concerning individual sustainability.
- Published
- 2019
- Full Text
- View/download PDF
19. A Review on Cloud Computing Acceptance Factors
- Author
-
Suriayati Chuprat, Roslina Ibrahim, and Mohd Talmizie Amron
- Subjects
Government ,Process management ,Higher education ,business.industry ,Computer science ,05 social sciences ,Public sector ,Cloud computing ,02 engineering and technology ,Private sector ,Phase (combat) ,020204 information systems ,0502 economics and business ,Health care ,0202 electrical engineering, electronic engineering, information engineering ,General Earth and Planetary Sciences ,business ,Cloud storage ,050203 business & management ,General Environmental Science - Abstract
Cloud computing technology is regarded as highly useful for organizations due to advantages such as long-term cost savings, economy and easy access to data at any time. In fact, there are many free cloud storage providers these days through which users can store and share their organizational data easily and efficiently. Both government and private sectors are looking into optimizing their cloud data storage. However, some of them are still in the early implementation phase, with issues in both technological and human factors that need to be properly addressed to ensure success. In the case of Malaysia, the Government highly encourages the use of cloud computing technology. Given that adoption is in its early phase, it is important to understand the factors that contribute to the use of cloud computing. This study reviews work done in healthcare, higher learning institutions and the public sector. It was found that the most significant factors for the use of cloud computing are technology readiness, human readiness, organizational support, environment, and security and privacy. We hope that this study will help strengthen knowledge and readiness in the implementation of cloud computing.
- Published
- 2017
- Full Text
- View/download PDF
20. Adopting ISO/IEC 27005:2011-based Risk Treatment Plan to Prevent Patients Data Theft
- Author
-
Yazriwati Yahya, Suriayati Chuprat, Mohd Naz'ri Mahrin, Laura Cassandra Hamit, Nurulhuda Firdaus Mohd Azmi, and Haslina Md Sarkan
- Subjects
General Computer Science ,business.industry ,Risk management framework ,General Engineering ,Data security ,Audit ,Data breach ,Information security ,Risk analysis (engineering) ,General Agricultural and Biological Sciences ,business ,Risk assessment ,Risk management ,Countermeasure (computer) - Abstract
The concern raised in late 2017 regarding the data breach of 46.2 million mobile device subscribers led the Malaysian police to start an investigation into the source of the leak. Data security is fundamental to protecting assets and information by ensuring their confidentiality, integrity and availability, not only in the telecommunication industry but also in other sectors. This paper attempts to protect the data of a patient-based clinical system by producing a risk treatment plan for its software products. The existing system is vulnerable to information theft, insecure databases, and weak audit login and password management. An information security risk assessment, consisting of identifying, analyzing and evaluating risks, was conducted before a risk assessment report was written. A risk management framework was applied to the software development unit of the organization to counter these risks, with the ISO/IEC 27005:2011 standard as the basis for the information security risk management framework. The controls from Annex A of ISO/IEC 27001:2013 were used to reduce the risks. Thirty risks were identified, of which 7 were recognized as high-level risks for the product. A risk treatment plan focusing on these risks and controls was developed to secure the patients' data. This will eventually enhance information security in the software development unit and, at the same time, increase awareness among team members concerning risks and the means to handle them.
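The identify–analyze–evaluate cycle described in the abstract can be sketched as a simple likelihood-times-impact scoring that orders risks into a treatment plan. The 1–5 scales, the high-risk threshold and the example risks below are illustrative assumptions, not figures from the study.

```python
# High-risk threshold on a 1-25 likelihood-times-impact scale (assumed value)
HIGH_RISK_THRESHOLD = 15

# Hypothetical example risks, not the 30 risks identified in the paper
risks = [
    {"id": "R1", "name": "Patient data theft via insecure database", "likelihood": 4, "impact": 5},
    {"id": "R2", "name": "Incomplete audit logging", "likelihood": 3, "impact": 3},
    {"id": "R3", "name": "Weak password management", "likelihood": 4, "impact": 4},
]

def assess(risk):
    """Analyze one risk: level = likelihood x impact, flag it if above threshold."""
    level = risk["likelihood"] * risk["impact"]
    return {**risk, "level": level, "high": level >= HIGH_RISK_THRESHOLD}

def treatment_plan(risks):
    """Evaluate all risks and order them so the highest levels are treated first."""
    return sorted((assess(r) for r in risks), key=lambda r: r["level"], reverse=True)

for r in treatment_plan(risks):
    print(f"{r['id']} level={r['level']} {'HIGH' if r['high'] else 'moderate'}: {r['name']}")
```

A real ISO/IEC 27005 assessment would also record assets, threats and vulnerabilities per risk; this sketch only shows the scoring and ordering step.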
- Published
- 2020
- Full Text
- View/download PDF
21. Fuzzy Delphi Method for Evaluating HyTEE Model (Hybrid Software Change Management Tool with Test Effort Estimation)
- Author
-
Suriayati Chuprat, Nurulhuda Firdaus, and Mazidah Mat Rejab
- Subjects
General Computer Science ,Traceability ,Cost estimate ,business.industry ,Computer science ,Change request ,Software maintainer ,05 social sciences ,Change management ,Code coverage ,Artifact (software development) ,010501 environmental sciences ,01 natural sciences ,Software quality ,Test effort ,Software ,0502 economics and business ,Regression testing ,Change management (engineering) ,Software system ,Software engineering ,business ,050203 business & management ,0105 earth and related environmental sciences ,Software configuration management - Abstract
When changes are made to a software system during development and maintenance, it needs to be tested again, i.e. regression tested, to ensure that the changes behave as intended and have not degraded software quality. This research will produce an automated tool that helps a software manager or maintainer search for coverage artifacts before and after a change request. A software quality engineer can then determine the test coverage of new changes, which supports cost, effort and schedule estimation. This study gathers the views and consensus of experts on the elements of the proposed model using the Fuzzy Delphi Method. Through purposive sampling, a total of 12 experts from academia and industry participated in the verification of items through a questionnaire instrument with 5-point linguistic scales. The outcomes show that 90% of the elements in the proposed model, covering change management, traceability support, test effort estimation support, regression testing support, reporting and GUI, meet the threshold value (d construct) of less than 0.2, with expert group agreement above 75%. This shows that, by expert consensus, all these elements are needed in the HyTEE Model (Hybrid Software Change Management Tool with Test Effort Estimation).
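The Fuzzy Delphi consensus check described above (threshold value d below 0.2 and expert agreement above 75%) can be sketched as follows, assuming a common triangular-fuzzy-number mapping for the 5-point linguistic scale; the scale values and expert responses below are illustrative, not the study's data.

```python
import math

# Assumed triangular fuzzy numbers for a 5-point linguistic scale
SCALE = {1: (0.0, 0.0, 0.2), 2: (0.0, 0.2, 0.4), 3: (0.2, 0.4, 0.6),
         4: (0.4, 0.6, 0.8), 5: (0.6, 0.8, 1.0)}

def d_value(a, b):
    """Vertex distance between two triangular fuzzy numbers."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3)

def consensus(responses, d_threshold=0.2, agree_threshold=0.75):
    """True if mean distance to the group average meets the d threshold
    and enough individual experts fall within it."""
    fuzzy = [SCALE[r] for r in responses]
    avg = tuple(sum(c) / len(fuzzy) for c in zip(*fuzzy))
    distances = [d_value(f, avg) for f in fuzzy]
    agree = sum(d <= d_threshold for d in distances) / len(distances)
    return sum(distances) / len(distances) <= d_threshold and agree >= agree_threshold

print(consensus([5, 4, 5, 4, 4, 5, 4, 4, 5, 4, 4, 5]))  # 12 experts, mostly agreeing
```

An element passing this check would be retained in the model; one failing it would be dropped or re-surveyed in a further Delphi round.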
- Published
- 2019
- Full Text
- View/download PDF
22. Sustainable Software Engineering: A Perspective of Individual Sustainability
- Author
-
Nurulhuda F, Haslina Md Sarkan, Sumaira Nazir, Suriayati Chuprat, Nargis Fatima, and Nilam Nur Amir Sjarif
- Subjects
General Computer Science ,business.industry ,Computer science ,General Engineering ,020207 software engineering ,02 engineering and technology ,Knowledge sharing ,Software Engineering Body of Knowledge ,Software development process ,Software ,Systematic review ,020204 information systems ,Sustainability ,0202 electrical engineering, electronic engineering, information engineering ,Domain knowledge ,Dimension (data warehouse) ,General Agricultural and Biological Sciences ,Software engineering ,business - Abstract
Sustainable software engineering is a means of developing sustainable software through sustainable software engineering process activities while balancing its various dimensions, for instance economic, environmental, social, technical and individual. The economic, technical, environmental and social dimensions have been explored to a satisfactory degree; however, the individual dimension, which concerns the wellbeing of software engineers, has not been explored to the same degree with respect to its understanding and challenges. Therefore, the aim of this study is to highlight and prioritize the challenges regarding the individual sustainability dimension. The study also provides mitigation strategies for the top five individual sustainability challenges. A systematic literature review was performed to report the challenges and mitigation strategies. The findings show that lack of domain knowledge, lack of methodologies and tool support, lack of education, varying and unidentified situations, and lack of sustainable software engineering practices are the topmost challenges regarding individual sustainability. These challenges need urgent attention to achieve the goal of sustainable software engineering. The study also reports various mitigation strategies to reduce the risk posed by the identified topmost challenges, such as introducing sustainable software engineering education and knowledge into software engineering curricula, developing knowledge sharing frameworks, and raising awareness of unclear and varying situations in each software engineering activity. The study will benefit the sustainable software engineering body of knowledge, practitioners and researchers by providing a classified list of individual sustainability challenges and their mitigation strategies.
- Published
- 2020
- Full Text
- View/download PDF
23. Support Vector Machine Algorithm for SMS Spam Classification in The Telecommunication Industry
- Author
-
Yazriwati Yahya, Nurulhuda Firdaus Mohd Azmi, Suriayati Chuprat, and Nilam Nur Amir Sjarif
- Subjects
Short Message Service ,General Computer Science ,Computer science ,business.industry ,General Engineering ,Telecommunications service ,Sms spam ,Support vector machine ,Naive Bayes classifier ,ComputingMethodologies_PATTERNRECOGNITION ,Cohen's kappa ,Support vector machine algorithm ,General Agricultural and Biological Sciences ,Telecommunications ,business ,Classifier (UML) - Abstract
In recent years, the telecommunication industry has witnessed dramatic growth in the number of mobile users, which has led to a drastic increase in the number of spam SMS messages. Short Message Service (SMS) is one of the most widely used communication channels in telecommunication services. In reality, most users ignore spam because of the low cost of SMS and the limited number of spam classification tools. In this paper, we propose a Support Vector Machine (SVM) algorithm for SMS spam classification; SVM is considered one of the most effective data mining techniques. The proposed algorithm was evaluated using a public dataset from the UCI machine learning repository. Its performance was compared with three other data mining techniques: Naive Bayes, Multinomial Naive Bayes and K-Nearest Neighbor with K = 1, 3 and 5. Based on measures such as accuracy, processing time, kappa statistic, error and false positive instances, SVM outperformed the other classifiers and was the most accurate classifier for detecting and labelling spam messages, with an average accuracy of 98.9%. Comparing the error parameters overall, the highest error was found for KNN with K = 3 and K = 5, whereas the models with the least error were SVM followed by Multinomial Naive Bayes. Therefore, the proposed method can serve as a strong baseline for further comparison in SMS spam classification.
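A minimal sketch of such an SVM spam classifier, assuming a TF-IDF text representation (the abstract does not specify the feature extraction used); the inline messages below are a tiny stand-in for the labelled UCI SMS Spam Collection.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Tiny inline stand-in for the UCI SMS Spam Collection
messages = [
    ("WINNER!! Claim your free prize now, call 09061701461", "spam"),
    ("URGENT! You have won a cash award, reply to claim", "spam"),
    ("Free entry into the weekly competition, text WIN to 80086", "spam"),
    ("Are we still meeting for lunch today?", "ham"),
    ("I'll call you when I get home", "ham"),
    ("Can you send me the report by tonight?", "ham"),
]
texts, labels = zip(*messages)

# TF-IDF features feeding a linear-kernel SVM, a common baseline for spam filtering
model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(texts, labels)

print(model.predict(["Claim your free cash prize now",
                     "See you at lunch tomorrow"]))
```

On the full corpus one would hold out a test split and report accuracy, kappa and false positives, as the paper does when comparing SVM against the Naive Bayes and KNN baselines.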
- Published
- 2020
- Full Text
- View/download PDF
24. Mobile Business Intelligence Acceptance Model for Organisational Decision Making
- Author
-
Haslina Md Sarkan, Yazriwati Yahya, Nurulhuda Firdaus Mohd Azmi, Lim Yee Fang, Suriayati Chuprat, and Nilam Nur Amir Sjarif
- Subjects
Control and Optimization ,Knowledge management ,Computer Networks and Communications ,Computer science ,Dashboard (business) ,02 engineering and technology ,Business intelligence ,Organizational decision making ,020204 information systems ,0502 economics and business ,0202 electrical engineering, electronic engineering, information engineering ,Computer Science (miscellaneous) ,Electrical and Electronic Engineering ,Instrumentation ,Mobile BI acceptance model ,business.industry ,05 social sciences ,Novelty ,Mobile business intelligence ,Hardware and Architecture ,Control and Systems Engineering ,Order (business) ,Technology acceptance model ,Metric (unit) ,Performance indicator ,business ,Mobile device ,050203 business & management ,Information Systems - Abstract
Mobile Business Intelligence (BI) is the ability to access BI-related data such as key performance indicators (KPIs), business metrics and dashboards through a mobile device. Mobile BI addresses the use case of remote or mobile workers who need on-demand access to business-critical data. Studying user acceptance of mobile BI is essential to identify the factors that influence acceptance of mobile BI applications. Research on mobile BI acceptance models for organizational decision-making is limited due to the novelty of mobile BI as a newly emerged innovation. To address this gap, this paper reviews existing work on mobile BI acceptance models for organizational decision-making. Two user acceptance models are reviewed: the Technology Acceptance Model and the Technology Acceptance Model for Mobile Services. Given how essential strategic decision-making is to organizational success, the potential of mobile BI in decision-making needs to be explored. Since mobile BI is still in its infancy, there is a need to study user acceptance and usage behavior of mobile BI in organizational decision-making, and there remains an opportunity to further investigate its impact.
- Published
- 2018
25. Malware Forensic Analytics Framework Using Big Data Platform
- Author
-
Pritheega Magalingam, Shamsul Sahibuddin, Firham M. Senan, Mohd. Zabri Adil Talib, Noor Azurati Ahmad, Suriayati Chuprat, Ganthan Narayana, Syahid Anuar, Mohd Naz'ri Mahrin, and Aswami Ariffin
- Subjects
business.industry ,Emerging technologies ,Computer science ,Digital forensics ,Big data ,020207 software engineering ,02 engineering and technology ,computer.software_genre ,Data science ,Data warehouse ,Visualization ,Analytics ,0202 electrical engineering, electronic engineering, information engineering ,Data analysis ,Malware ,020201 artificial intelligence & image processing ,business ,computer - Abstract
The dramatically increased threats to our cyber world, such as malware attacks, are a vital sign that security must be strengthened in a more proactive way. Thus, in recent research we proposed an integrated malware forensic analytics framework that exposes future threats of malware attacks. This framework incorporates malware collection, malware analytics and visualization of discovered malware attacks. In this paper, we present the design and implementation of the framework, focusing on analytics and visualization and utilizing the emerging technology of a big data platform. The implementation of the framework shows promising results in presenting descriptive analytics and predicting future attacks using machine learning algorithms. We also demonstrate the feasibility of the Hortonworks Cybersecurity Package (HCP) in supporting the proposed framework. Finally, we discuss future work to further improve the implementation of the framework.
- Published
- 2018
- Full Text
- View/download PDF
26. Challenges and Benefits of Modern Code Review-Systematic Literature Review Protocol
- Author
-
Suriayati Chuprat, Nargis Fatima, and Sumaira Nazir
- Subjects
Code review ,Computer science ,business.industry ,Static program analysis ,computer.software_genre ,Software quality ,Knowledge sharing ,Engineering management ,Software ,Software quality assurance ,Software inspection ,Open-source software development ,business ,computer - Abstract
Software quality is a major concern of all stakeholders, whether acquirers or suppliers of software. Software inspection is a static verification technique that can contribute to the quality of the software being developed. As the development trend moves towards open source software development, smart and lightweight techniques have emerged. One of them is modern code review, a lightweight software inspection technique: an informal, static code analysis performed by a small team using collaborative tools. It is argued that modern code review provides benefits beyond defect detection but is also subject to various challenges, such as lack of trust, communication issues, ineffective knowledge sharing, personal conflicts and varying situations. There has been little research on systematizing and identifying the challenges faced, and benefits gained, by Modern Code Review (MCR) teams, and there is a clear need to recognize and organize these challenges and benefits based on the modern code review literature. The aim of this study is to highlight and organize the challenges and benefits in the modern code review context. It will benefit software quality assurance practitioners and researchers by providing a classified list of challenges and benefits.
- Published
- 2018
- Full Text
- View/download PDF
27. A Review for Improving Software Change using Traceability Model with Test Effort Estimation
- Author
-
Suriayati Chuprat, Nurulhuda Firdaus Mohd Azmi, and Mazidah Mat Rejab
- Subjects
Test effort ,Software ,Traceability ,business.industry ,Computer science ,Regression testing ,Software development ,Test suite ,Software system ,Software maintenance ,business ,Reliability engineering - Abstract
Maintaining a software system includes tasks such as fixing defects, adding new features, or modifying the software (software changes) to accommodate different environments. The modified software system then needs to be tested to ensure the changes have no adverse effects on previously validated code. Regression testing is one of the approaches software testers use to test the modified system. The traditional regression testing strategy was to repeat all previous tests, retesting all features of the program even for small modifications. For programs with thousands of lines of code (LOC), the cost of retesting the entire system after every change is prohibitive, and this practice is becoming increasingly difficult given the demand to test new functionality and correct errors with limited resources. Numerous techniques and tools have been proposed and developed to reduce the cost of regression testing and to aid regression testing processes, such as test suite reduction, test case prioritization, and work on the thresholds and weightings used in regression testing. However, there is still a need to study software traceability models for coverage analysis of software changes during regression testing, as well as test effort estimation for regression testing. Hence, this paper describes a proposal for improving software changes with a hybrid traceability model and test effort estimation during regression testing. We explain the proposed work, including the problem background, the intended research objectives, a literature review and a plan for future implementation. This study is expected to contribute a hybrid traceability model for large software development projects that supports software changes during regression testing with a test estimation approach, and to reduce operational cost during software maintenance. It is also hoped that an efficient and improved solution to regression testing can be realized, benefiting software testers and project managers in managing software maintenance, a critical part of software project development.
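As one concrete example of the prioritization techniques surveyed above, the widely used greedy "additional coverage" strategy can be sketched as follows; the test names and statement coverage sets are hypothetical.

```python
def prioritize(coverage):
    """Greedy additional-coverage prioritization: repeatedly pick the test
    that covers the most not-yet-covered statements (ties broken by name)."""
    remaining = dict(coverage)
    covered, order = set(), []
    while remaining:
        best = max(sorted(remaining), key=lambda t: len(remaining[t] - covered))
        order.append(best)
        covered |= remaining.pop(best)
    return order

# Hypothetical mapping from test case to the statements it executes
coverage = {
    "t1": {"s1", "s2"},
    "t2": {"s1", "s2", "s3", "s4"},
    "t3": {"s5"},
    "t4": {"s4", "s5"},
}
print(prioritize(coverage))
```

Running the tests in this order detects faults in changed code sooner, which is the point of prioritization when the full regression suite is too expensive to run after every change.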
- Published
- 2018
- Full Text
- View/download PDF
28. Social Networking Sites Habits and Addiction Among Adolescents in Klang Valley
- Author
-
Nurazean Maarop, Suriayati Chuprat, Roslina Ibrahim, Yazriwati Yahya, Nor Zairah Ab. Rahim, and Haslina Md Sarkan
- Subjects
General Computer Science ,Social network ,business.industry ,Computer science ,media_common.quotation_subject ,Addiction ,Sample (statistics) ,Habit ,business ,Social psychology ,media_common - Abstract
Social networking sites (SNS) are very popular applications in today's society and, to a certain extent, have changed the way people communicate. This kind of technology has become a trend among users regardless of whether its impact on them is positive or negative. The level of SNS usage among adolescents has started to raise concern among parents and society. SNS addiction has become problematic in certain countries, especially the United States, and lately the issue has spread all over the world; Malaysia is among the countries affected. SNS addiction is not an isolated phenomenon: it starts from high engagement in SNS usage and originates from habitual behavior. Therefore, it is important to understand the SNS habits and addiction of adolescents in Malaysia. The purpose of this study is to analyze and explore the usage of SNS among adolescents in Malaysia, specifically in the Klang Valley, examining SNS usage behavior in terms of habit and addiction. Data were collected from a sample of 60 respondents using an online survey and analyzed descriptively using SPSS. The analysis found that most of the adolescents used SNS daily, the majority for more than two hours per day. Patterns of habit and addiction in SNS usage show that some adolescents exhibit habitual and addictive behavior.
- Published
- 2018
- Full Text
- View/download PDF
29. Improving the accuracy of complex activities recognition using accelerometer-embedded mobile phone classifiers
- Author
-
Teddy Mantoro, Mohammed Mobark, and Suriayati Chuprat
- Subjects
Activity recognition ,business.industry ,Computer science ,Mobile phone ,Feature extraction ,Artificial intelligence ,business ,Machine learning ,computer.software_genre ,Accelerometer ,computer ,Classifier (UML) - Abstract
Using mobile phones for Human Activity Recognition (HAR) is very helpful for observing the daily habits of users and for early detection of health problems or accidents. Many published studies have investigated HAR with the help of mobile phones; however, they mainly focus on simple, single locomotion activities, whereas in real-world situations human activities are often performed in complex ways. This study investigates the recognition of complex activities with classifiers commonly used for recognizing human activities. Data on complex activities were collected, features were extracted, and the activities were classified. The experiment shows that the recognition accuracy for low-level activities is higher than for high-level activities across all seven classifiers. The highest accuracy was achieved by the IBk classifier (KNN), in both sensor positions and at all three activity levels. Activities recorded at the armband position achieved higher accuracy across all seven classifiers than those recorded at the waist position. The study concludes that these classifiers recognize low-level (simple) activities well, but their performance degrades as activity complexity increases, so classifiers suited to complex activities are needed.
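The pipeline described above — collect windows of accelerometer data, extract features, classify — can be sketched with a hand-rolled IBk-style k-nearest-neighbour classifier over simple statistical features; the synthetic accelerometer windows below are illustrative stand-ins for the study's data.

```python
import math
from collections import Counter
from statistics import mean, stdev

def features(window):
    """Mean and standard deviation of acceleration magnitude over a window."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    return (mean(mags), stdev(mags))

def knn_predict(train, query, k=3):
    """IBk-style classification: majority vote among the k nearest feature vectors."""
    ranked = sorted(train, key=lambda item: math.dist(features(item[0]), features(query)))
    return Counter(label for _, label in ranked[:k]).most_common(1)[0][0]

# Synthetic windows of (x, y, z) samples: steady gravity-only vs. oscillating signal
standing = [(0.0, 0.0, 9.8)] * 10
walking = [(3.0 if i % 2 else 0.0, 0.0, 9.8) for i in range(10)]
train = [(standing, "standing"), (walking, "walking"),
         (standing, "standing"), (walking, "walking")]

query = [(2.5 if i % 2 else 0.0, 0.0, 9.8) for i in range(10)]  # unseen oscillating window
print(knn_predict(train, query))
```

For the complex, high-level activities the study discusses, richer features (frequency-domain, per-axis correlations) would be needed, which is consistent with its finding that simple classifiers degrade as complexity grows.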
- Published
- 2017
- Full Text
- View/download PDF
30. The challenges of Extract, Transform and Loading (ETL) system implementation for near real-time environment
- Author
-
Othman Mohd Yusop, Adilah Sabtu, Saiful Adli Ismail, Haslina Md Sarkan, Nurulhuda Firdaus Mohd Azmi, Suriayati Chuprat, and Nilam Nur Amir Sjarif
- Subjects
Transaction processing ,Computer science ,business.industry ,020208 electrical & electronic engineering ,Real-time computing ,Big data ,02 engineering and technology ,computer.software_genre ,Data type ,Data warehouse ,020204 information systems ,High availability ,Scalability ,0202 electrical engineering, electronic engineering, information engineering ,business ,Implementation ,computer ,Data integration - Abstract
For organizations with considerable investment in data warehousing, the influx of various data types and forms requires ways of preparing data, and a staging platform, that let fast, efficient and volatile data reach its targeted audiences or users with different business needs. The Extract, Transform and Load (ETL) system has proved to be the standard of choice for managing and sustaining the movement and transactional processing of valued big data assets. However, the traditional ETL system can no longer accommodate or effectively handle streaming or near real-time data, an environment that demands high availability, low latency and horizontal scalability. This paper identifies the challenges of implementing an ETL system for streaming or near real-time data, for which ETL needs to evolve and streamline itself to meet the different requirements. Current efforts and solution approaches to address the challenges are presented. The classification of ETL system challenges is organized by near real-time environment features and ETL stages, to encourage different perspectives for future research.
- Published
- 2017
- Full Text
- View/download PDF
31. Web Crime Mining by Means of Data Mining Techniques
- Author
-
Suriayati Chuprat, Javid Hosseinkhani Naniz, Suhaimi Ibrahim, and Javad Hosseinkhani
- Subjects
Engineering ,General Computer Science ,business.industry ,Process (engineering) ,General Engineering ,computer.software_genre ,Data science ,Field (computer science) ,Text mining ,Web mining ,Data mining ,Crime data ,business ,computer - Abstract
The purpose of this study is to review the mining of useful information by means of data mining. Data mining is the procedure of extracting knowledge and information from large sets of data by applying artificial intelligence methods to find hidden relationships in the data. Data mining applications have attracted growing researcher attention, and one crucial field is criminology, where data mining is applied to identify crime characteristics. Crime analysis involves detecting and exploring crimes and investigating their relationships with criminals. Criminology is a suitable field for data mining techniques given the high volume and complexity of relationships within crime datasets. Identifying crime characteristics is therefore the first step for further analysis, and the knowledge obtained from data mining approaches is a very useful tool to help and support police forces. This research provides a review of extracting useful information by means of data mining in order to find crime hot spots and predict crime trends using crime data mining techniques.
- Published
- 2014
- Full Text
- View/download PDF
32. DESIGN AND IMPLEMENTATION OF A PRIVACY PRESERVED OFF-PREMISES CLOUD STORAGE
- Author
-
Suriayati Chuprat, Jamalul-lail Ab Manan, Mervat Adib Bamiah, and Sarfraz Nawaz Brohi
- Subjects
Service (systems architecture) ,Cloud computing security ,Computer Networks and Communications ,Computer science ,business.industry ,Privacy policy ,Internet privacy ,Cloud computing ,Audit ,Computer security ,computer.software_genre ,Metadata ,Artificial Intelligence ,Backup ,business ,Cloud storage ,computer ,Software - Abstract
Despite several cost-effective and flexible characteristics of cloud computing, some clients are reluctant to adopt this paradigm due to emerging security and privacy concerns. Organizations such as healthcare and the payment card industry, where confidentiality of information is vital, are not confident in the security techniques and privacy policies offered by cloud service providers. Malicious attackers have violated cloud storages to steal, view, manipulate and tamper with clients' data, and attacks on cloud storages are extremely challenging to detect and mitigate. In order to formulate privacy-preserved cloud storage, this research paper proposes an improved technique consisting of five contributions: a resilient role-based access control mechanism, partial homomorphic cryptography, metadata generation and sound steganography, an efficient third-party auditing service, and a data backup and recovery process. We implemented these components using Java Enterprise Edition with GlassFish Server. Finally, we evaluated the proposed technique by penetration testing, and the results showed that clients' data remain intact and protected from malicious attackers.
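To illustrate the "partial homomorphic cryptography" contribution above, a toy additively homomorphic scheme can be sketched. Paillier is assumed here as a representative partially homomorphic cryptosystem (the abstract does not name the scheme used), and the key sizes below are far too small for real use; the point is only that a sum can be computed on ciphertexts without decrypting them.

```python
import math
import random

def keygen(p=1789, q=1861):              # tiny fixed primes, demo only
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)                 # valid because we fix g = n + 1
    return (n, n + 1), (lam, mu, n)      # public (n, g), private (lambda, mu, n)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:           # r must be invertible mod n
        r = random.randrange(2, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    lam, mu, n = priv
    return ((pow(c, lam, n * n) - 1) // n * mu) % n  # L(c^lambda mod n^2) * mu mod n

pub, priv = keygen()
c1, c2 = encrypt(pub, 42), encrypt(pub, 58)
print(decrypt(priv, (c1 * c2) % (pub[0] ** 2)))  # prints 100: the sum, computed encrypted
```

This additive property is what lets a storage provider aggregate encrypted values on the client's behalf without ever seeing the plaintexts.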
- Published
- 2014
- Full Text
- View/download PDF
33. Proposing a Framework for Exploration of Crime Data Using Web Structure and Content Mining
- Author
-
Hamed Taherdoost, Javad Hosseinkhani, Suriayati Chuprat, Amin Shahraki Moghaddam, and Hadi Barani Baravati
- Subjects
Structure (mathematical logic) ,Engineering ,General Computer Science ,business.industry ,Process (engineering) ,Digital data ,General Engineering ,Law enforcement ,ComputingMilieux_LEGALASPECTSOFCOMPUTING ,Data science ,Set (abstract data type) ,World Wide Web ,Scalability ,business ,Web crawler ,Reliability (statistics) - Abstract
The purpose of this study is to propose a framework, and a high-level architecture of a scalable universal crawler, to close the reliability gap and present the process of forensic data analysis of criminal suspects. For law enforcement agencies, criminal web data provide relevant, anonymous information. Pieces of this digital data are used in forensic analysis of suspects' social networks, but assessing them is difficult: an operator must manually extract the relevant information from website text, find the links, and classify them into a database structure before the data set is ready for the various criminal network evaluation tools. This procedure is inefficient because it is error-prone, and the quality of the analyzed data depends on the expertise and experience of the investigator, so the reliability of the results is not consistent and good results come only from knowledgeable operators. The objective of this study is to show the process of investigating criminal suspects through forensic data analysis, closing the reliability gap by proposing a structure and applying a high-level architecture of a scalable universal crawler.
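The crawler component of such a framework reduces, at its core, to a breadth-first frontier with de-duplication over discovered links. This sketch replaces real HTTP fetching and link extraction with a mocked in-memory link graph, so the page names are hypothetical.

```python
from collections import deque

# Mocked "web": page -> outgoing links, standing in for HTTP fetch + link extraction
WEB = {
    "seed":  ["page1", "page2"],
    "page1": ["page2", "page3"],
    "page2": ["seed"],
    "page3": [],
}

def crawl(seed, fetch_links):
    """Breadth-first crawl from a seed, visiting each page exactly once."""
    frontier, seen, visited = deque([seed]), {seed}, []
    while frontier:
        url = frontier.popleft()
        visited.append(url)              # here one would extract and store content
        for link in fetch_links(url):
            if link not in seen:         # de-duplicate before enqueueing
                seen.add(link)
                frontier.append(link)
    return visited

print(crawl("seed", WEB.get))
```

A scalable version would shard the frontier and the seen-set across machines, which is the essential difference between this sketch and the high-level architecture the paper proposes.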
- Published
- 2013
- Full Text
- View/download PDF
34. Parameters Consideration in Designing a Magnetorheological Damper
- Author
-
M.J. Mughni, Hairi Zamzuri, Izyan Iryani Mohd Yazid, Saiful Amri Mazlan, and Suriayati Chuprat
- Subjects
Engineering ,business.industry ,Mechanical Engineering ,Circuit design ,Mechanical engineering ,Structural engineering ,Piston rod ,Physics::Classical Physics ,Finite element method ,Damper ,law.invention ,Magnetic circuit ,Piston ,Mechanics of Materials ,law ,Magnetorheological fluid ,General Materials Science ,Magnetorheological damper ,business - Abstract
This paper presents a simulation study of the electromagnetic circuit design for a mixed-mode Magnetorheological (MR) damper. The magnetic field generated by the electromagnetic circuit of the MR damper was simulated using the Finite Element Method Magnetics (FEMM) software package. All geometry parameters were considered and adjusted in order to obtain the best MR damper performance. Ultimately, six different parameter considerations were proposed: the selection of materials, the polarity of the coils, the diameters of the piston, piston rod and core, the shear and squeeze gap clearances, the piston pole length and the thickness of the housing.
- Published
- 2013
- Full Text
- View/download PDF
35. ABCD rules segmentation on malignant tumor and benign skin lesion images
- Author
-
Suriayati Chuprat, Nurulhuda Firdaus Mohd Azmi, Yazriwati Yahya, and Haslina Md Sarkan
- Subjects
021110 strategic, defence & security studies ,Dermatoscopy ,integumentary system ,medicine.diagnostic_test ,business.industry ,0211 other engineering and technologies ,Pattern recognition ,Image processing ,Feature selection ,02 engineering and technology ,Image segmentation ,medicine.disease ,Variegation (histology) ,0202 electrical engineering, electronic engineering, information engineering ,medicine ,020201 artificial intelligence & image processing ,Segmentation ,Computer vision ,False alarm ,Birthmark ,Artificial intelligence ,business - Abstract
A skin lesion is defined as a superficial growth or patch of the skin that is visually different from its surrounding area. Skin lesions appear for many reasons, such as symptoms indicative of disease, birthmarks, allergic reactions, and so on. Images of skin lesions are analyzed by computer to capture certain features characteristic of skin diseases; these activities can be defined as automated skin lesion diagnosis (ASLD). ASLD involves five steps: image acquisition, pre-processing to remove occluding artifacts (such as hair), segmentation to extract regions of interest, feature selection, and classification. This paper presents an analysis of automated segmentation using the ABCD rules (Asymmetry, Border irregularity, Color variegation, Diameter) in image segmentation. The experiment was carried out on malignant tumor and benign skin lesion images. The study shows that the ABCD rules successfully classified the images with high total dermatoscopy score (TDS) values. Although some of the analyses show false alarm results, they may give significant input to the search for a suitable segmentation measure.
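The total dermatoscopy score (TDS) referenced in the abstract is conventionally computed from weighted ABCD sub-scores. The abstract does not state which variant the paper used; the sketch below follows the widely cited Stolz weighting, in which the D criterion scores dermoscopic (differential) structures rather than diameter, so treat it as an assumption rather than the paper's exact formula.

```python
# Total dermatoscopy score (TDS) from the ABCD rule of dermatoscopy.
# Weights and decision thresholds follow the common Stolz formulation;
# the paper above may use a different variant.

def tds(asymmetry, border, colors, structures):
    """asymmetry in 0..2, border in 0..8, colors in 1..6, structures in 1..5."""
    return 1.3 * asymmetry + 0.1 * border + 0.5 * colors + 0.5 * structures

def classify(score):
    if score < 4.75:
        return "benign"
    if score <= 5.45:
        return "suspicious"
    return "malignant"

score = tds(asymmetry=2, border=5, colors=4, structures=3)
print(round(score, 2), classify(score))  # 6.6 malignant
```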
- Published
- 2016
- Full Text
- View/download PDF
36. Acceptance of cloud computing in the Malaysian public sector: A proposed model
- Author
-
Mohd Talmizie Amron, Nur Azaliah Abu Bakar, Suriayati Chuprat, and Roslina Ibrahim
- Subjects
Organizational Behavior and Human Resource Management ,Government ,lcsh:Management. Industrial management ,business.industry ,05 social sciences ,Public sector ,Cloud computing ,Management Science and Operations Research ,Work environment ,Individual analysis ,lcsh:HD28-70 ,0502 economics and business ,050211 marketing ,business ,Telecommunications ,Unified communications ,050203 business & management - Abstract
The Malaysian government has initiated a government cloud project as an integration of cloud computing and unified communication-based applications toward a digital, cloud-based work environment. However, impact studies have found that the implementation of this project has several weaknesses, such as a lack of infrastructure support, weak IT knowledge, and a lack of awareness among public sector employees, causing the applications not to be fully utilized. It is therefore crucial to conduct a study measuring acceptance of the government cloud project, given the substantial investment in it. This study applied the Unified Theory of Acceptance and Use of Technology (UTAUT), the Technology Readiness Index (TRI) and several additional factors to develop a research model divided into two main factors: technological and human. The technological factor may determine the likelihood of acceptance by the public sector and stimulate employees to accept the technology; the human factor covers the characteristics of people in the public sector that create the need for, and the ability to accept, cloud computing. The proposed model will be used to evaluate individual acceptance of cloud computing in the Malaysian public sector. For future work, the model needs to be enriched with interview sessions and quantitative surveys to validate the findings.
- Published
- 2019
- Full Text
- View/download PDF
37. Secured data partitioning in multi cloud environment
- Author
-
Suriayati Chuprat and Hazila Hasan
- Subjects
Information privacy ,Cloud computing security ,Computer science ,business.industry ,Data_MISCELLANEOUS ,Data security ,Cloud computing ,Computer security ,computer.software_genre ,Pure Data ,Work (electrical) ,Cloud testing ,Data partitioning ,business ,computer ,computer.programming_language - Abstract
Data security and privacy have become the biggest concerns in cloud computing, since data can be compromised if cloud storage is attacked or during a cloud service breakdown. The adoption of data partitioning in multi-cloud environments to address this issue has therefore attracted the interest of academia and researchers. This paper discusses the concept of data partitioning, the current state of the art of data partitioning, and secured data partitioning in multi-cloud environments. We then compare the current security approaches used to secure data partitioning in multi-cloud environments. Lastly, we conclude that pure data partitioning alone is insufficient to address the data security and privacy problem, and we will extend this issue in our future work.
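One simple way to see what "secured" partitioning adds over plain fragmentation is XOR secret splitting: each cloud provider stores one share, any single share is indistinguishable from random noise, and only the combination of all shares reconstructs the data. This is an illustrative sketch of the idea, not a scheme from the survey above.

```python
import os

# Illustrative XOR secret splitting across multiple cloud providers.
# Plain fragmentation leaves each fragment readable; here each share
# alone is random noise, and all n shares are needed to reconstruct.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split(data: bytes, n_clouds: int) -> list:
    """Return one share per cloud; any n-1 shares reveal nothing."""
    shares = [os.urandom(len(data)) for _ in range(n_clouds - 1)]
    last = data
    for s in shares:
        last = xor_bytes(last, s)   # fold each random share into the final one
    return shares + [last]

def combine(shares) -> bytes:
    out = shares[0]
    for s in shares[1:]:
        out = xor_bytes(out, s)
    return out

record = b"patient #1024: blood type O+"
shares = split(record, 3)            # e.g. three cloud providers
print(combine(shares) == record)     # True
```

Note that this scheme provides confidentiality but not availability: losing any one provider's share makes the data unrecoverable, which is one reason partitioning alone is insufficient and is typically paired with redundancy or erasure coding.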
- Published
- 2014
- Full Text
- View/download PDF
38. Situational requirement engineering framework for Global Software Development
- Author
-
Mohd Naz'ri Mahrin, Huma Hayat Khan, and Suriayati Chuprat
- Subjects
Identification (information) ,Software ,Knowledge management ,Requirements engineering ,business.industry ,Order (exchange) ,Computer science ,Process (engineering) ,Formal specification ,Software development ,Situational ethics ,business - Abstract
In order to achieve successful software development in a Global Software Development (GSD) environment, the GSD community needs to define the Requirements Engineering (RE) process with situational characteristics in mind. Currently, requirements engineers face challenges in identifying the possible situational factors that can influence RE activities, and there is no framework to help them identify the situational factors that affect RE activities the most. To close this gap, we explored the situational factors that can influence RE activities. We conducted an industry survey and performed a statistical analysis to identify the most influential factors, which were then formulated into a situational RE framework. This framework not only helps RE process participants identify the situational factors but also guides them in identifying the most influential situational factors for each RE activity.
- Published
- 2014
- Full Text
- View/download PDF
39. Trusted cloud computing framework for healthcare sector
- Author
-
Sarfraz Nawaz Brohi, Mervat Adib Bamiah, Suriayati Chuprat, and Jamalul-lail Ab Manan
- Subjects
Cloud computing security ,Computer science ,business.industry ,Computer Networks and Communications ,Internet privacy ,Cloud computing ,Access control ,Data breach ,Trusted Computing ,Software, Cloud Computing, HIPAA, Security, Privacy, Trust ,Encryption ,Computer security ,computer.software_genre ,Trusted Network Connect ,Security controls ,Elasticity (cloud computing) ,Artificial Intelligence ,Direct Anonymous Attestation ,Trusted Platform Module ,business ,computer ,Software ,Hacker - Abstract
Cloud computing is rapidly evolving due to its efficient characteristics such as cost-effectiveness, availability and elasticity. Healthcare organizations and consumers lose control when they outsource their sensitive data and computing resources to a third-party Cloud Service Provider (CSP), which may raise security and privacy concerns related to data loss and misuse. Consumers' lack of knowledge about where their data is stored may lead to violations of the rules and regulations of the Health Insurance Portability and Accountability Act (HIPAA), which can cost them huge penalties. Fear of a data breach by internal or external hackers may decrease consumers' trust in adopting cloud computing and benefiting from its promising features. We designed a Healthcare Trusted Cloud Computing (HTCC) framework that maintains security and privacy and takes HIPAA regulations into account. The HTCC framework deploys Trusted Computing Group (TCG) technologies such as the Trusted Platform Module (TPM), Trusted Software Stack (TSS), virtual Trusted Platform Module (vTPM), Trusted Network Connect (TNC) and Self-Encrypting Drives (SEDs). We emphasize strong multi-factor authentication access control mechanisms and strict security controls, as well as encryption for data at rest, in transit and in process. We also contributed a cloud Service Level Agreement (SLA) customized to healthcare requirements. HTCC was evaluated by comparison with previous researchers' work and through an expert survey; the results were satisfactory and showed acceptance of the framework. We aim for our proposed framework to help optimize trust in cloud computing for adoption in the healthcare sector.
- Published
- 2014
40. Situational requirement engineering: A systematic literature review protocol
- Author
-
Mohd Naz'ri Mahrin, Suriayati Chuprat, and Huma Hayat Khan
- Subjects
Systematic review ,Knowledge management ,Requirements engineering ,Process (engineering) ,business.industry ,Computer science ,Formal specification ,Software development ,Situational ethics ,business ,Work related ,Protocol (object-oriented programming) - Abstract
Requirements Engineering (RE) is known to be one of the critical phases of software development. A large body of RE work has already been published, and the field is maturing day by day, leading to exploration at deeper levels. It is argued that RE is subject to situational characteristics, and this exposure grows when RE is performed in a global software development environment. There is a need to identify these situational characteristics from the RE literature, so we plan to systematically explore situational RE studies to distinguish and account for the state of the art in reported situational RE research. The objective of this paper is to provide a systematic literature review (SLR) protocol that illustrates a process for synthesizing the situational RE work, ultimately presenting the state of the art of the field in global software development environments. The SLR aims not only to summarize the data related to situational RE in the form of situational characteristics, but also to be useful to RE practitioners working in global software development environments by providing a checklist based on situational characteristics. It will also assist RE researchers in discovering knowledge gaps, in order to distinguish needs and opportunities for future research directions in situational RE in global software development environments.
- Published
- 2013
- Full Text
- View/download PDF
41. Situational factors affecting Requirement Engineering process in Global Software Development
- Author
-
Mohd Naz'ri Mahrin, Huma Hayat Khan, and Suriayati Chuprat
- Subjects
Software development process ,Software Engineering Process Group ,Social software engineering ,Requirement ,Process management ,Requirements engineering ,Computer science ,business.industry ,Personal software process ,Functional requirement ,Software requirements ,Software engineering ,business - Abstract
An optimal Requirements Engineering process is considered to be subject to the situational characteristics of software requirements engineering settings. These characteristics relate to the organization, project, process, requirements, stakeholders and so on. However, a list of situational factors affecting the Requirements Engineering process during Global Software Development is presently not available. The lack of such a study is challenging, as it not only restrains the ability to improve the Requirements Engineering process but can also undermine the competence of the Requirements Engineering team in determining the core constraints and characteristics of the software being developed. To address this scarcity, we have merged a considerable body of related research into an initial list of situational factors that affect the Requirements Engineering process during Global Software Development. We performed a Systematic Literature Review to identify the situational factors, and used thorough data coding techniques adopted from Grounded Theory to merge the data. The initial list consists of 37 factors and is a sound initial reference list for Requirements Engineering process definition in Global Software Development environments. The outcome of this study represents a significant contribution to the Requirements Engineering body of knowledge.
- Published
- 2013
- Full Text
- View/download PDF
42. Zero-delay FPGA-based odd-even sorting network
- Author
-
Azizah Abdul Manaf, Nadia Parsazadeh, Amirshahram Hematian, and Suriayati Chuprat
- Subjects
Bitonic sorter ,Sorting algorithm ,business.industry ,Computer science ,Sorted array ,Sorting ,Parallel computing ,Odd–even sort ,External sorting ,Adaptive sort ,Data_FILES ,Sorting network ,business ,Computer hardware - Abstract
Sorting is one of the most well-known problems in computer science and is frequently used for benchmarking computer systems; it can contribute significantly to the overall execution time of a process. Dedicated sorting architectures can be used to accelerate applications and/or to reduce energy consumption. In this paper, we propose an efficient sorting network aimed at accelerating the sorting operation in FPGA-based embedded systems. The proposed network implements an Optimized Odd-Even sorting method (O2) using a fully pipelined combinational logic architecture and ring-shaped processing. Consequently, O2 generates the sorted array in parallel as soon as the input array is presented, without delay or lag. Unlike conventional sorting networks, the O2 sorting network needs neither memory to hold data and sorting state nor an input clock to perform the sorting operations sequentially. We conclude that by using O2 in FPGA-based image processing, we can optimize the performance of filters, such as the median filter, that demand high-performance sorting for real-time applications.
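The base method behind this design can be sketched in software: in the classic odd-even transposition network, each stage is a rank of independent compare-exchange units that a hardware implementation would evaluate in parallel combinational logic. The sketch below shows that base network, not the paper's optimized O2 variant.

```python
# Classic odd-even transposition sorting network in software form.
# Each stage is a rank of compare-exchange units; in hardware all
# comparisons within a stage happen simultaneously, and n stages
# suffice to sort any n inputs.

def odd_even_transposition_sort(values):
    a = list(values)
    n = len(a)
    for stage in range(n):                  # n stages guarantee a sorted output
        start = stage % 2                   # alternate odd/even pairings
        for i in range(start, n - 1, 2):    # independent compare-exchanges
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
    return a

print(odd_even_transposition_sort([9, 3, 7, 1, 8, 2]))  # [1, 2, 3, 7, 8, 9]
```

Because every stage touches only disjoint pairs, the network's latency in hardware is n stages of a single comparator delay each, which is what makes it attractive for fixed-size real-time filters such as the median filter mentioned above.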
- Published
- 2013
- Full Text
- View/download PDF
43. A study on significance of adopting cloud computing paradigm in healthcare sector
- Author
-
Sarfraz Nawaz Brohi, Mervat Adib Bamiah, Suriayati Chuprat, and Jamalul-lail Ab Manan
- Subjects
Knowledge management ,Cloud computing security ,business.industry ,Computer science ,Health care ,Mobile computing ,eMix ,Cloud computing ,The Internet ,Smart card ,business ,Data science - Abstract
The healthcare sector is an information-critical industry that deals with human lives. Transforming from traditional paper-based records to Electronic Health Records (EHRs) was not efficient enough, since EHRs require resources, integration, maintenance and high-cost implementation. The cloud computing paradigm offers a flexible, cost-effective, collaborative, multi-tenant infrastructure which assists in transforming electronic healthcare into smart healthcare, built on the latest technologies such as smart mobile devices, smart cards, robots, sensors and tele-health systems, delivered via the internet on a pay-per-use basis for best medical practices. Cloud computing reduces the cost of EHRs in terms of ownership and IT maintenance; it also offers sharing, integration and management of EHRs, as well as tracking patients and diseases more efficiently and effectively. This review paper presents the significance of, and opportunities for, implementing cloud computing in the healthcare sector.
- Published
- 2012
- Full Text
- View/download PDF
44. Cloud implementation security challenges
- Author
-
Sarfraz Nawaz Brohi, Mervat Adib Bamiah, Suriayati Chuprat, and Muhammad Nawaz Brohi
- Subjects
Information privacy ,Cloud computing security ,business.industry ,Service delivery framework ,Computer science ,Cloud computing ,Computer security ,computer.software_genre ,Security service ,Utility computing ,Software deployment ,Cloud testing ,Scalability ,The Internet ,Data as a service ,business ,computer - Abstract
Cloud computing offers significant features such as resource pooling, scalability, on-demand self-service, availability, and reliability that help organizations improve their quality of service. For example, by using cloud computing services in healthcare it is possible to reach large populations in isolated geographical areas, which can assist in saving lives in critical situations. Cloud computing enables the use of the latest technologies through its various service delivery and deployment models, via the internet, on a pay-per-use billing pattern. However, it has a dark side when it comes to security and privacy. Critical industries such as healthcare and banking are reluctant to trust cloud computing for fear of losing their sensitive data, which resides in the cloud with no knowledge of its location and with no transparency into the mechanisms Cloud Service Providers (CSPs) use to secure their data and applications; this has created a barrier to adopting this agile computing paradigm. This paper addresses the cloud computing security concerns that must be considered in order to adopt cloud services in information-critical industries.
- Published
- 2012
- Full Text
- View/download PDF
45. Towards an Efficient and Secure Educational Platform on cloud infrastructure
- Author
-
Mervat Adib Bamiah, Sarfraz Nawaz Brohi, Jamalul-lail Ab Manan, and Suriayati Chuprat
- Subjects
Flexibility (engineering) ,Cloud computing security ,business.industry ,Computer science ,Security as a service ,Cloud computing ,Intrusion detection system ,Trusted Computing ,Computer security ,computer.software_genre ,Information and Communications Technology ,Scalability ,Trusted Platform Module ,business ,computer - Abstract
Existing educational platforms are highly costly and inefficient in terms of scalability, flexibility of infrastructure, availability, recovery, accessibility and security. Cloud computing is considered a flexible business and technological model for providing an Efficient Educational Platform (EEP) thanks to its significant features. However, an EEP alone is not the complete solution Educational Organizations (EOs) require: they need an Efficient as well as Secure Educational Platform (ESEP). Since cloud computing is an open-access global technology, several security threats from malicious users can arise, and EOs are reluctant to trust and adopt this paradigm because of threats that can compromise the security of their confidential data. In this research paper, we describe the ICT-related challenges faced by educational platforms, emphasize the significance of cloud computing in enabling an EEP, and finally propose a conceptual model for developing an ESEP. Our contribution comprises several security tools and techniques, including Trusted Virtual Domains (TVDs), Security as a Service (SECaaS), intrusion detection tools, the Trusted Platform Module (TPM) and the virtual TPM (vTPM).
- Published
- 2012
- Full Text
- View/download PDF
46. Field programmable gate array system for real-time IRIS recognition
- Author
-
Amirshahram Hematian, Reza Khaleghparast, Azizah Abdul Manaf, Suriayati Chuprat, and Sepideh Yazdani
- Subjects
Biometrics ,urogenital system ,business.industry ,Computer science ,fungi ,Feature extraction ,Iris recognition ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,urologic and male genital diseases ,female genital diseases and pregnancy complications ,ComputingMethodologies_PATTERNRECOGNITION ,medicine.anatomical_structure ,Parallel processing (DSP implementation) ,Gate array ,medicine ,Computer vision ,cardiovascular diseases ,Pattern matching ,Artificial intelligence ,Iris (anatomy) ,Field-programmable gate array ,business - Abstract
Iris recognition is one of the most flawless recognition methods in biometrics. However, most iris recognition algorithms are implemented as sequential operations running on central processing units (CPUs). In this article we propose a prototype design for iris recognition based on a field-programmable gate array (FPGA) in order to improve iris recognition performance through parallel computing. The time-consuming iris recognition sub-processes are implemented fully in parallel to achieve optimum performance. Unlike commonly used iris recognition methods that first capture a single image of an eye and then start the recognition process, we were able to speed up iris recognition by localizing the pupil and iris boundaries, unwrapping the iris image and extracting its features while image capture was still in progress. Consequently, live images of the human eye can be processed continuously without delay. We conclude that accelerating iris recognition with parallel computing can be a complete success when implemented on low-cost FPGAs.
- Published
- 2012
- Full Text
- View/download PDF
47. Using virtual machine monitors to overcome the challenges of monitoring and managing virtualized cloud infrastructures
- Author
-
Suriayati Chuprat, Mervat Adib Bamiah, and Sarfraz Nawaz Brohi
- Subjects
Software_OPERATINGSYSTEMS ,Cloud computing security ,Computer science ,Hardware virtualization ,business.industry ,Full virtualization ,Temporal isolation among virtual machines ,Hypervisor ,Cloud computing ,computer.software_genre ,Virtualization ,Virtual machine ,Embedded system ,Operating system ,business ,computer - Abstract
Virtualization is one of the hottest research topics nowadays. Academic researchers and developers from the IT industry are designing approaches for solving the security and manageability issues of Virtual Machines (VMs) residing on virtualized cloud infrastructures. Moving an application from a physical to a virtual platform increases efficiency and flexibility and reduces management cost and effort. Cloud computing adopts the virtualization paradigm: using this technique, memory, CPU and computational power are provided to clients' VMs by utilizing the underlying physical hardware. Beside these advantages, adopting virtualization brings challenges such as management of VMs and network traffic, unexpected additional costs, and resource allocation. The Virtual Machine Monitor (VMM), or hypervisor, is the tool cloud providers use to manage the VMs in a cloud, and several heterogeneous hypervisors are offered by various vendors, including VMware, Hyper-V, Xen and the Kernel Virtual Machine (KVM). Considering the challenge of VM management, this paper describes several techniques to monitor and manage virtualized cloud infrastructures.
- Published
- 2012
- Full Text
- View/download PDF
48. Nurse scheduling with fairness criteria for public hospital
- Author
-
Nuzulha Khilwani Ibrahim, Habibollah Haron, Mohd Hakimi Aiman Ibrahim, Suriayati Chuprat, and Rabiah Ahmad
- Subjects
Schedule ,Knowledge management ,Job shop scheduling ,business.industry ,Process (engineering) ,media_common.quotation_subject ,InformationSystems_GENERAL ,Nurse scheduling problem ,Public hospital ,Health care ,Operations management ,business ,Psychology ,Duty ,media_common - Abstract
Nurses are the main players in a healthcare organization, providing services to the community around the clock. The nurse duty roster is thus one of the most important components in healthcare for ensuring that a hospital or clinic satisfies its patients. In usual practice, the duty roster is built manually or is only partly automated, and fairness criteria are often neglected; these situations may lead to dissatisfaction among nurses. In this paper, we report the results of our investigation of the nurse scheduling process in several Malaysian public hospitals. We also conducted a survey to collect a list of the fairness criteria most desired when building the schedule. The survey results are presented and discussed.
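A fairness criterion of the kind surveyed here can be made concrete by quantifying it over a roster. The sketch below uses one hypothetical criterion, an even spread of night shifts, purely as an illustration; the paper's actual survey items are not reproduced here, and the nurse names and shift labels are invented.

```python
from collections import Counter

# Hypothetical fairness metric for a nurse roster: how unevenly are
# night shifts distributed? A spread of 0 means perfectly even.

def night_shift_spread(roster, nurses):
    """roster: list of (nurse, shift) pairs, shift in {'AM', 'PM', 'NIGHT'}."""
    counts = Counter(nurse for nurse, shift in roster if shift == "NIGHT")
    per_nurse = [counts.get(n, 0) for n in nurses]
    return max(per_nurse) - min(per_nurse)

roster = [("Aina", "NIGHT"), ("Aina", "NIGHT"), ("Badri", "NIGHT"),
          ("Badri", "AM"), ("Chong", "PM")]
print(night_shift_spread(roster, ["Aina", "Badri", "Chong"]))  # 2
```

A scheduler can use such a metric as a penalty term: among rosters that satisfy the hard coverage constraints, prefer the one with the smallest spread.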
- Published
- 2011
- Full Text
- View/download PDF
49. Mouse movement behavioral biometric systems
- Author
-
Suhailan Safei, Rabiah Ahmad, Nazirah Abd Hamid, Siti Dhalila Mohd Satar, and Suriayati Chuprat
- Subjects
Identification (information) ,Matching (statistics) ,Biometric system ,Biometrics ,Human–computer interaction ,business.industry ,Movement (music) ,Computer science ,Pattern recognition (psychology) ,Access control ,User interface ,business ,Simulation - Abstract
A biometric system is a pattern-recognition system that recognizes a person based on either physiological or behavioral characteristics, and is used to provide access control to valuable assets. In this paper, we propose a behavioral biometric system that uses random mouse movement to identify a user. We developed a prototype of the proposed system and tested it with a number of users. The experiment produced 14 matches, equal to a 46.67% successful matching rate. These results can serve as a preliminary observation of a user's behavior when interacting with an arbitrary application using a mouse.
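A mouse-movement matcher of this general kind can be sketched as feature extraction plus distance-based matching. The features (mean speed and mean direction), the threshold, and the sample trajectories below are all illustrative assumptions, not the paper's actual method or data.

```python
import math

# Illustrative sketch (not the paper's method): summarize a mouse
# trajectory of (x, y, t) samples into simple behavioral features,
# then accept a probe if it lies within a distance threshold of the
# enrolled template.

def features(trajectory):
    speeds, angles = [], []
    for (x0, y0, t0), (x1, y1, t1) in zip(trajectory, trajectory[1:]):
        dist = math.hypot(x1 - x0, y1 - y0)
        speeds.append(dist / max(t1 - t0, 1e-9))   # pixels per second
        angles.append(math.atan2(y1 - y0, x1 - x0))  # movement direction
    n = len(speeds)
    return (sum(speeds) / n, sum(angles) / n)

def matches(template, probe, threshold=50.0):
    return math.dist(template, probe) < threshold

enrolled = features([(0, 0, 0.0), (30, 40, 0.5), (60, 80, 1.0)])
probe    = features([(0, 0, 0.0), (28, 45, 0.5), (55, 85, 1.0)])
print(matches(enrolled, probe))  # True
```

In practice such systems use many more features (acceleration, pauses, curvature) and a tuned decision threshold; the two-feature version only shows the enrollment/verification structure.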
- Published
- 2011
- Full Text
- View/download PDF
50. Histogram-Based Fruit Ripeness Identification Using Nearest-Neighbor Distance
- Author
-
Fatma Susilawati Mohamad, Suriayati Chuprat, and Azizah Abdul Manaf
- Subjects
Identification (information) ,business.industry ,Computer science ,Histogram ,Pattern recognition ,Artificial intelligence ,Ripeness ,business ,k-nearest neighbors algorithm - Published
- 2011
- Full Text
- View/download PDF