1,323 results
Search Results
2. The sorting techniques: a tutorial paper on card sorts, picture sorts and item sorts.
- Author
- Rugg, Gordon and McGeorge, Peter
- Subjects
- *SORTING (Electronic computers), *ELECTRONIC data processing, *COMPUTER programming, *INTELLIGENT tutoring systems, *PERSONAL construct theory
- Abstract
Although sorting techniques (e.g. card sorts) are widely used in knowledge acquisition and requirements acquisition, they have received little formal attention compared to related techniques such as repertory grids and laddering. This paper briefly describes the main sorting techniques, and then provides a detailed tutorial on one variety (repeated single-criterion sorts), using a worked example. Guidelines for choice and sequencing of techniques are given, both in relation to varieties of sorting technique and in relation to other techniques. It is concluded that the sorting techniques are a valuable part of the elicitor's methodological toolkit. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
3. Livelihoods, conflict and aid programming: is the evidence base good enough?
- Author
- Mallett, Richard and Slater, Rachel
- Subjects
- SOCIAL conflict, DISASTER relief, COMPUTER programming, PUBLIC welfare, PEACEBUILDING
- Abstract
In conflict-affected situations, aid-funded livelihood interventions are often tasked with a dual imperative: to generate material welfare benefits and to contribute to peacebuilding outcomes. There may be some logic to such a transformative agenda, but does the reality square with the rhetoric? Through a review of the effectiveness of a range of livelihood promotion interventions-from job creation to microfinance-this paper finds that high quality empirical evidence is hard to come by in conflict-affected situations. Many evaluations appear to conflate outputs with impacts and numerous studies fail to include adequate information on their methodologies and datasets, making it difficult to appraise the reliability of their conclusions. Given the primary purpose of this literature-to provide policy guidance on effective ways to promote livelihoods-this silence is particularly concerning. As such, there is a strong case to be made for a restrained and nuanced handling of such interventions in conflict-affected settings. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
4. A comparison of concurrent programming and cooperative multithreading. (A preliminary version of this paper appeared in Euro–Par 2000, held in Munich, Germany, August 2000.)
- Author
- Keen, Aaron W., Ishihara, Takashi, Maris, Justin T., Tiejun Li, Fodor, Eugene F., and Olsson, Ronald A.
- Subjects
- THREADS (Computer programs), COMPUTER multitasking, ELECTRONIC data processing, COMPUTER programming, COMPUTER software
- Abstract
This paper presents a comparison of the cooperative multithreading model with the general concurrent programming model. It focuses on the execution time performance of a range of standard concurrent programming applications. The overall results are mixed. In some cases, programs written in the cooperative multithreading model outperform those written in the general concurrent programming model. The contributions of this paper are twofold. First, it presents a thorough analysis of the performances of applications in the different models, i.e. to explain the criteria that determine when a program in one model will outperform an equivalent program in the other. Second, it examines the tradeoffs in writing programs in the different programming styles. In some cases, better performance comes at the cost of more complicated code. Copyright © 2003 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
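The cooperative model evaluated in the abstract above can be sketched briefly. The following is a minimal illustration (our own construction in Python, not code from the paper): cooperative tasks yield control explicitly, so interleaving is deterministic and there are no preemptive context switches between yield points.

```python
# Minimal sketch of cooperative multithreading: each task is a
# generator, and control only transfers at explicit `yield` points.

def cooperative_run(tasks):
    """Round-robin scheduler over generator tasks; returns the trace
    of values produced, in scheduling order."""
    trace = []
    while tasks:
        task = tasks.pop(0)
        try:
            trace.append(next(task))
            tasks.append(task)          # re-queue until exhausted
        except StopIteration:
            pass
    return trace

def worker(name, steps):
    for i in range(steps):
        yield f"{name}:{i}"             # voluntary context-switch point

print(cooperative_run([worker("A", 2), worker("B", 2)]))
# → ['A:0', 'B:0', 'A:1', 'B:1']
```

Because control transfers only at `yield`, shared state touched between yields needs no locking, which is one side of the performance-versus-code-complexity tradeoff the paper examines.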
5. Strategies for the efficient exploitation of loop-level parallelism in Java. (Preliminary versions of the material presented in this paper appeared in the proceedings of the ACM 2000 Java Grande Conference and The Second Workshop on Java for High Performance Computing (ICS'2000).)
- Author
- Oliver, José, Guitart, Jordi, Ayguadé, Eduard, Navarro, Nacho, and Torres, Jordi
- Subjects
- JAVA programming language, INTERNET programming, PROGRAM transformation, THREADS (Computer programs), JAVASPACES technology, COMPUTER programming
- Abstract
This paper analyzes the overheads incurred in the exploitation of loop-level parallelism using Java Threads and proposes some code transformations that minimize them. The transformations avoid the intensive use of Java Threads and reduce the number of classes used to specify the parallelism in the application (which reduces the time for class loading). The use of such transformations results in promising performance gains that may encourage the use of Java for exploiting loop-level parallelism in the framework of OpenMP. On average, the execution time for our synthetic benchmarks is reduced by 50% from the simplest transformation when eight threads are used. The paper explores some possible enhancements to the Java threading API oriented towards improving the application–runtime interaction. Copyright © 2001 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2001
- Full Text
- View/download PDF
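The core of the transformation described above, avoiding the intensive creation of fresh threads for every parallel loop by reusing a fixed set of workers, can be sketched as follows. This is our own Python analogue (the paper targets Java Threads and OpenMP; `parallel_loop` and the chunking scheme are our invention):

```python
# Sketch: a thread pool is created once and reused across many
# parallel loops, so per-loop overhead is only work distribution,
# not thread start-up.
from concurrent.futures import ThreadPoolExecutor

def parallel_loop(pool, data, chunk):
    """Split `data` into chunks and sum each chunk on the shared pool."""
    chunks = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    return sum(pool.map(sum, chunks))

with ThreadPoolExecutor(max_workers=4) as pool:   # created once
    # many "parallel loops" reuse the same worker threads
    totals = [parallel_loop(pool, list(range(n)), 8) for n in (16, 32)]
print(totals)   # → [120, 496]
```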
6. The Open Runtime Platform: a flexible high-performance managed runtime environment. (An earlier version of this paper was published online [1].)
- Author
- Cierniak, Michal, Eng, Marsha, Glew, Neal, Lewis, Brian, and Stichnoth, James
- Subjects
- GARBAGE collection (Computer science), COMPUTER memory management, JAVA programming language, COMPUTER programming, ELECTRONIC data processing, VIRTUAL machine systems
- Abstract
The Open Runtime Platform (ORP) is a high-performance managed runtime environment (MRTE) that features exact generational garbage collection, fast thread synchronization, and multiple coexisting just-in-time compilers (JITs). ORP was designed for flexibility in order to support experiments in dynamic compilation, garbage collection, synchronization, and other technologies. It can be built to run either Java or Common Language Infrastructure (CLI) applications, to run under the Windows or Linux operating systems, and to run on the IA-32 or Itanium processor family (IPF) architectures. Achieving high performance in a MRTE presents many challenges, particularly when flexibility is a major goal. First, to enable the use of different garbage collectors and JITs, each component must be isolated from the rest of the environment through a well-defined software interface. Without careful attention, this isolation could easily harm performance. Second, MRTEs have correctness and safety requirements that traditional languages such as C++ lack. These requirements, including null pointer checks, array bounds checks, and type checks, impose additional runtime overhead. Finally, the dynamic nature of MRTEs makes some traditional compiler optimizations, such as devirtualization of method calls, more difficult to implement or more limited in applicability. To get full performance, JITs and the core virtual machine (VM) must cooperate to reduce or eliminate (where possible) these MRTE-specific overheads. In this paper, we describe the structure of ORP in detail, paying particular attention to how it supports flexibility while preserving high performance. We describe the interfaces between the garbage collector, the JIT, and the core VM; how these interfaces enable multiple garbage collectors and JITs without sacrificing performance; and how they allow the JIT and the core VM to reduce or eliminate MRTE-specific performance issues. Copyright © 2005 John Wiley & Sons, Ltd. 
[ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
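The MRTE-specific correctness requirements listed in the abstract (null pointer checks, array bounds checks, type checks) can be made concrete with a small sketch. This is purely an illustration of the checks themselves, written by us, not ORP code; a JIT's job is to emit these guards or prove them redundant:

```python
# The three safety guards a managed runtime imposes on a load that a
# C++ compiler would omit: null check, type check, bounds check.

def checked_load(array, index):
    if array is None:                       # null pointer check
        raise ValueError("null reference")
    if not isinstance(array, list):         # type check
        raise TypeError("expected an array")
    if not 0 <= index < len(array):         # array bounds check
        raise IndexError("index out of bounds")
    return array[index]

print(checked_load([10, 20, 30], 1))   # → 20
```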
7. Preamble computation in automated test case generation using constraint logic programming. (A version of this paper was originally presented at SoftTest II: The Second U.K. Workshop on Software Testing Research, held at the University of York, U.K., 4–5 September 2003. It is reproduced here in modified form with the permission of the Workshop organizers.)
- Author
- Colin, Séverine, Legeard, Bruno, and Peureux, Fabien
- Subjects
- LOGIC programming, COMPUTER science, SOFTWARE engineering, COMPUTER systems, COMPUTER programming, COMPUTER software industry
- Abstract
BZ-Testing-Tools (BZ-TT) is a tool-set for automated model-based test case generation from B abstract machines and Z specifications. BZ-TT uses boundary testing as well as cause–effect testing on the basis of the formal model. It has been used and validated on several industrial case studies in the domain of critical software: in particular for smart card applications and automotive embedded systems. The main idea of BZ-TT is to compute a boundary goal for each effect of the operations of the model and then to compute a preamble sequence of operations to place the system under test in such a state that satisfies the goal. In this paper, the preamble computation search strategies used in BZ-TT are presented. More precisely, two algorithms based respectively on forward chaining and backward chaining are compared. These algorithms both use a customized set constraint solver, which is able to animate the formal model. These algorithms differ, however, in their capacity to reach the boundary goals efficiently. The results of applying the tools to an industrial windscreen wiper controller application are presented. Copyright © 2004 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
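The preamble idea above, finding a sequence of operations that drives the system into a state satisfying a boundary goal, can be sketched as a forward-chaining search. The toy model and names below are ours, not BZ-TT's (which animates a formal B/Z model through a set constraint solver rather than enumerating concrete states):

```python
# Forward-chaining preamble search: breadth-first over operation
# sequences until a state satisfying the boundary goal is reached.
from collections import deque

def find_preamble(initial, operations, goal, limit=10):
    """operations: name -> (guard, effect) over hashable states."""
    queue = deque([(initial, [])])
    seen = {initial}
    while queue:
        state, seq = queue.popleft()
        if goal(state):
            return seq
        if len(seq) >= limit:
            continue
        for name, (guard, effect) in operations.items():
            if guard(state):
                nxt = effect(state)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, seq + [name]))
    return None

# Toy model: a bounded counter; the boundary goal is state == 3.
ops = {"inc": (lambda s: s < 5, lambda s: s + 1),
       "dec": (lambda s: s > 0, lambda s: s - 1)}
print(find_preamble(0, ops, lambda s: s == 3))  # → ['inc', 'inc', 'inc']
```

Backward chaining, the second strategy the paper compares, would instead start from the goal and apply inverted operations until the initial state is reached.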
8. High-level data races. (An earlier version of this paper was originally presented at the First International Workshop on Verification and Validation of Enterprise Information Systems, held at Angers, France on 22 April 2003. It is reproduced here in modified form with the permission of the Workshop organizers and the publishers of the proceedings, ICEIS.)
- Author
- Artho, Cyrille, Havelund, Klaus, and Biere, Armin
- Subjects
- DATA structures, COMPUTER programming, THREADS (Computer programs), SOFTWARE verification, TESTING, COMPUTER algorithms
- Abstract
Data races are a common problem in concurrent and multi-threaded programming. Experience shows that the classical notion of a data race is not powerful enough to capture certain types of inconsistencies occurring in practice. This paper investigates data races on a higher abstraction layer. This enables detection of inconsistent uses of shared variables, even if no classical race condition occurs. For example, a data structure representing a coordinate pair may have to be treated atomically. By lifting the meaning of a data race to a higher level, such problems can now be covered. The paper defines the concepts ‘view’ and ‘view consistency’ to give a notation for this novel kind of property. It describes what kinds of errors can be detected with this new definition, and where its limitations are. It also gives a formal guideline for using data structures in a multi-threaded environment. © US Government copyright [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
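The coordinate-pair example from the abstract can be made concrete. The following is our own minimal Python illustration, not the paper's notation: every access is lock-protected, so no classical data race exists, yet the pair can be observed half-updated because the writer releases the lock between the two field updates.

```python
# A *high-level* race: each access is individually synchronized, but
# the (x, y) pair is not updated atomically in move_broken.
import threading

class Point:
    def __init__(self):
        self.lock = threading.Lock()
        self.x = 0
        self.y = 0

    def move_broken(self, x, y):
        with self.lock:          # accesses view {x} only
            self.x = x
        # another thread may snapshot (new x, old y) here
        with self.lock:          # accesses view {y} only
            self.y = y

    def move_atomic(self, x, y):
        with self.lock:          # accesses the joint view {x, y}
            self.x = x
            self.y = y

    def snapshot(self):
        with self.lock:
            return (self.x, self.y)
```

Roughly, a view-consistency check in the paper's sense would flag `move_broken` because it uses the views {x} and {y} separately while readers rely on the joint view {x, y}.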
9. Fair Evaluation of Orientation‐Averaging Techniques in Light‐Scattering Simulations: Comment on "Evaluation of Higher‐Order Quadrature Schemes in Improving Computational Efficiency for Orientation‐Averaged Single‐Scattering Properties of Nonspherical Ice Particles" by Fenni et al
- Author
- Yurkin, Maxim A.
- Subjects
- PARTICLE symmetries, SIMULATION methods & models, COMPUTER programming, LIGHT scattering
- Abstract
In a recent paper Fenni et al. (2021, https://doi.org/10.1029/2020jd034172) compared the code MIDAS, based on the direct solution of the volume‐integral equation combined with advanced cubatures for orientation averaging, to the code DDSCAT, a state‐of‐the‐art implementation of the discrete dipole approximation. This comment highlights methodological issues in this comparison and shows that the quantitative claims of Fenni et al. (2021, https://doi.org/10.1029/2020jd034172), related to superiority of MIDAS over DDSCAT, are based on very specific test cases with respect to particle symmetries or initial orientation, as well as to the selected scattering quantity of interest. Thus, these claims are not expected to hold for other similar particles. Moreover, the detailed discussion of these issues is relevant for all light‐scattering simulation methods, except those allowing analytical orientation averaging. Thus, the comment constructs general guidelines for fair evaluation of orientation‐averaging techniques in a wide range of light‐scattering methods and computer codes. Plain Language Summary: The paper discusses several issues that appear when one is comparing different orientation‐averaging techniques (cubatures) in combination with the same or different light‐scattering simulation methods. Fair evaluation of cubature performance in realistic general scenarios is important both for practitioners (to choose the most efficient combination of the existing codes and cubatures) and for code developers (to set their priorities on the new features with the largest expected benefits). Unfortunately, the performance of the cubatures is complexly interwoven with the internals of the simulation methods and depends on specific test particles and computed scattering quantities. This questions the generality of conclusions in some previous publications. 
Based on this discussion, the paper ends with general guidelines for fair evaluation of cubatures, allowing future studies to arrive at general conclusions, so that they can be directly used by other researchers. Key Points:
- Quantitative conclusions of Fenni et al. (2021) are based on very specific test cases.
- Orientation‐averaging techniques should be compared on non‐symmetric particles, and not with a special initial orientation.
- Any comparison of simulated results should consider their uncertainties, accounting for all sources of errors. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
10. Computers and Qualitative Data Analysis: Paper, Pens, and Highlighters vs. Screen, Mouse, and Keyboard.
- Author
- Duff, Patricia A. and Sror, Jérémie
- Subjects
- COMPUTER software, DATA analysis, COMPUTER software development, COMPUTER programming, DATABASES, INTEGRATED software
- Abstract
The article analyzes the value of the computer-assisted qualitative data analysis software (CAQDAS) that was developed as an alternative to the traditional pen, paper, and scissors approach to handling complex data. CAQDAS not only does faster, more systematically, and more easily what can be done by hand, but also "does more with data" thanks to "a range of techniques and tools that were impossible, unknown or too time-consuming before computers entered the field." Indeed, from its earliest incarnations, CAQDAS took advantage of the computer's text retrieval and database capabilities. Researchers could search and separate data files into segments that they could tag or code for easy retrieval. Over the years, software became more sophisticated, adding functions that went beyond these simple code-and-retrieve procedures. Current software packages allow researchers to record memos of their developing ideas and to write up the research. The software also enables them to use various formats to visualize the analysis, including indexes, graphical displays and tables.
- Published
- 2005
11. Response to More comments on: A cohesion measure for object-oriented classes (the more extensive paper can be found at http://salmosa.kaist.ac.kr).
- Author
- Heung-Seok Chae, Yong-Rae Kwon, and Doo-Hwan Bae
- Subjects
- OBJECT-oriented methods (Computer science), OBJECT-oriented programming, COMPUTER programming, COMPUTER science, COMPUTER algorithms
- Abstract
The authors insist that monotonicity is a necessary property of a good cohesion metric and the violation of the monotonicity property limits the application of CBMC. They also state that the augmented CBMC can also be used as a guideline for quality evaluation and restructuring of poorly designed classes. This paper raises the question about the necessity of monotonicity by analyzing the reason that causes CBMC to violate the monotonicity property. In addition, we give a detailed description of the restructuring procedure based on CBMC. Copyright © 2003 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
12. Interface mutation. (A version of this paper was originally presented at Mutation 2000, a Symposium on Mutation Testing, held in San Jose, California, 6–7 October 2000. It is reproduced here in modified form with the permission of the Symposium organizers.)
- Author
- Ghosh, S. and Mathur, A. P.
- Subjects
- COMPUTER software testing, SOFTWARE verification, COMPUTER programming, ELECTRONIC data processing, COMPUTER algorithms
- Abstract
Applications that utilize a broker-based architecture are often composed of components that need to be tested individually and in combination. Furthermore, adequacy assessment of tests of components is useful in that it assists testers in identifying weaknesses in the tests generated so far and in offering hints on what the new tests must be. Traditional test adequacy criteria have limitations for commercial use, especially when tests for large components are to be assessed for their adequacy. This paper describes a test adequacy criterion based on interface mutation and a method, based on the criterion, to test components. This method requires the mutation of elements only from within a component's interface and not from within the code that implements the interface. The adequacy criterion based on interface mutation was evaluated empirically and compared with coverage criteria based on control flow for its relative effectiveness in revealing errors and in the cost incurred in developing test sets that satisfy the criterion. Copyright © 2001 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2001
- Full Text
- View/download PDF
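Interface mutation, as described above, mutates only elements of a component's interface rather than its implementation. A toy sketch of the adequacy idea (our own construction; the component, mutants, and scoring are invented for illustration, not the paper's operators or tool):

```python
# A test set is adequate under interface mutation if it "kills"
# (distinguishes) every mutant of the component's interface.

def component(rate, base):
    """Component under test; its interface is the pair (rate, base)."""
    return base * (1 + rate)

# Interface mutants perturb only how the interface is exercised.
mutants = {
    "swap_args": lambda rate, base: component(base, rate),
    "drop_rate": lambda rate, base: component(0.0, base),
}

def killed_by(test_inputs):
    """Return the set of mutant names this test set distinguishes."""
    killed = set()
    for name, mutant in mutants.items():
        for rate, base in test_inputs:
            if mutant(rate, base) != component(rate, base):
                killed.add(name)
                break
    return killed

print(sorted(killed_by([(0.1, 100.0)])))  # → ['drop_rate', 'swap_args']
```

The mutation score (killed mutants divided by total mutants) is then the adequacy measure; here a single well-chosen input already kills both mutants.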
13. Investigating the effectiveness of object-oriented testing strategies using the mutation method. (A version of this paper was originally presented at Mutation 2000, a Symposium on Mutation Testing, held in San Jose, California, 6–7 October 2000. It is reproduced here in modified form with the permission of the Symposium organizers.)
- Author
- Kim, Sun-Woo, Clark, John A., and McDermid, John A.
- Subjects
- PROGRAMMING languages, SYNTAX (Grammar), COMPUTER software, COMPUTER programming, ELECTRONIC data processing
- Abstract
The mutation method assesses test quality by examining the ability of a test set to distinguish syntactic deviations representing specific types of faults from the program under test. This paper describes an empirical study performed to evaluate the effectiveness of object-oriented (OO) test strategies using the mutation method. The test sets for the experimental system are generated according to three selected OO test strategies, and their effectiveness is compared by determining how well the developed test sets kill injected mutants derived from an established mutation system, Mothra, and from the authors' own OO-specific mutation technique, termed Class Mutation. Copyright © 2001 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2001
- Full Text
- View/download PDF
14. Unit and integration testing strategies for C programs using mutation. (A version of this paper was originally presented at Mutation 2000, a Symposium on Mutation Testing, held in San Jose, California, 6–7 October 2000. It is reproduced here in modified form with the permission of the Symposium organizers.)
- Author
- Vincenzi, A. M. R., Maldonado, J. C., Barbosa, E. F., and Delamaro, M. E.
- Subjects
- PROGRAMMING languages, COMPUTER software testing, COMPUTER software, COMPUTER programming, COMPUTER systems, ELECTRONIC data processing
- Abstract
Mutation testing, originally proposed for unit testing, has been extended to integration testing with the proposition of the Interface Mutation criterion. This paper presents the results of an experiment using two mutation-based testing criteria for unit and integration testing phases: the Mutation Analysis and the Interface Mutation adequacy criteria, respectively. The aim is to investigate how they can be used in a complementary way during the testing activity, establishing an incremental testing strategy comprising the unit and integration testing phases and guidelines on how to obtain a high mutation score with respect to mutation testing with a low cost, in terms of the number of mutants generated. Copyright © 2001 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2001
- Full Text
- View/download PDF
15. Focus section on program debugging.
- Author
- Tse, T. H.
- Subjects
- COMPUTER debugging software, COMPUTER software safety measures, COMPUTER programming, CONFERENCE papers
- Abstract
The author focuses on program debugging, the process of finding and removing bugs from a computer program, and its internal mechanism. He notes that this special section on program debugging is drawn from articles presented at the 35th Annual International Computer Software and Applications Conference (COMPSAC 2011), held in Munich, Germany in July 2011. The selected papers relate to statistical fault localization in computer programs.
- Published
- 2013
- Full Text
- View/download PDF
16. Combining computer vision and standardised protocols for improved measurement of live sea urchins for research and industry.
- Author
- De Vos, Bas C., Cyrus, Mark D., Macey, Brett M., Batik, Theodore, and Bolton, John J.
- Subjects
- COMPUTER vision, SEA urchins, MEASUREMENT errors, COMPUTER programming, CAMERA phones, AQUACULTURE industry
- Abstract
To allow sea urchin aquaculture to achieve its intended scale, efficient and precise methods for measuring large numbers of urchins in commercial‐scale operations are needed. Current protocols for measuring urchin test (shell) dimensions and mass are time‐consuming and prone to high measurement error, thus inconvenient in research and impractical in a commercial context. This study investigates and compares various measurement methods with a computer vision approach newly developed in this study, to establish a single protocol using precise, efficient and accessible methodology for measuring live urchins. We show that urchin wet mass can vary up to 8.73% depending on time out of water; this is significantly reduced to an average change of 0.1% by allowing urchins to drip‐dry for at least 90 s prior to weighing. We found the conventional vernier calliper method used to measure urchin dimensions to be both time‐consuming and imprecise (mean coefficient of variation (CV) of 2.41% for Tripneustes gratilla). Conversely, the computer vision programme we developed measures with higher precision (mean CV of 1.55% for T. gratilla) and is considerably faster. The software uses a series of hue saturation value filters, edge detection algorithms and distortions to measure the diameter of the test (excluding spines) of multiple urchins at once. The software is open‐source, and the protocol does not require specialised equipment (it can be performed with a mobile phone camera). When the computer vision application is combined with the simple procedures described in this paper to reduce measurement inaccuracies, urchin wet mass and diameter can be more efficiently and precisely determined. In a larger scale context, this software could easily be incorporated into various tools, such as a grading machine, to completely automate various farm processes. As such, this study has potential to assist urchin data collection in both research and commercial contexts.
[ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
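The measurement pipeline the abstract describes (hue-saturation-value filtering to isolate the urchin test, then a diameter estimate) can be approximated in a few lines. This sketch is our drastic simplification using only the standard library; the thresholds, the circular-outline assumption, and the mm-per-pixel scale are all invented placeholders, not values from the published open-source programme:

```python
# HSV-threshold a tiny RGB grid, then estimate diameter from the
# masked area assuming a roughly circular outline.
import colorsys
import math

def mask_hsv(image_rgb, h_range, s_min, v_min):
    """image_rgb: 2-D grid of (r, g, b) in 0..1. Returns boolean mask."""
    mask = []
    for row in image_rgb:
        out = []
        for r, g, b in row:
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            out.append(h_range[0] <= h <= h_range[1]
                       and s >= s_min and v >= v_min)
        mask.append(out)
    return mask

def diameter_mm(mask, mm_per_px):
    """Equivalent-circle diameter of the masked region."""
    area_px = sum(sum(row) for row in mask)
    return 2 * math.sqrt(area_px / math.pi) * mm_per_px

# 4x4 toy frame: purple-ish "urchin" pixels on a pale background.
purple, bg = (0.5, 0.1, 0.6), (0.9, 0.9, 0.9)
img = [[purple if 1 <= i <= 2 and 1 <= j <= 2 else bg for j in range(4)]
       for i in range(4)]
m = mask_hsv(img, h_range=(0.6, 0.95), s_min=0.3, v_min=0.2)
print(round(diameter_mm(m, mm_per_px=10.0), 1))  # → 22.6
```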
17. Davids versus Goliaths: Epigenetic dynamics and structural change in the Swedish innovation system.
- Author
- Romano, Silvina A. and Zabala‐Iturriagagoitia, Jon Mikel
- Subjects
- TECHNOLOGICAL innovations, TECHNOLOGY transfer, DISRUPTIVE innovations, COMPUTER programming, INDUSTRIALISM, MARKET design & structure (Economics), EPIGENETICS, STRUCTURAL dynamics, STRUCTURAL health monitoring
- Abstract
Entrepreneurship induces novelty in the innovation system, generates selection processes, and shapes market structure over time. Entrepreneurial firms participate in the creation and diffusion of new technologies, the creation and early development of new markets and industries, and the renewal of existing firms and industries. But does entrepreneurship also facilitate the transformation and adaptability of innovation systems? In this paper, we analyze whether the Swedish innovation system's industrial structure underwent a transformation as a result of the emergence, growth, and development of new entrepreneurial firms during the last financial crisis. In particular, we focus on the dynamics of two relevant sectors of the Swedish economy which nevertheless show different structural characteristics: manufacture of machinery and industrial equipment, and computer programming. Our analysis reveals that large companies use firm acquisitions to respond to the threats posed by new emerging entrepreneurial firms. In the current context of disruptive change, acquisitions are thus used as a defensive mechanism for firms' survival and adaptation rather than as a means to transfer or gain new knowledge and technology. We discuss how entrepreneurial ecosystems and innovation support policies can take two forms depending on the stage of development of an innovation system. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
18. Hack it with EDUCHIC! Educational hackathons and interdisciplinary challenges—Definitions, principles, and pedagogical guidelines.
- Author
-
Vanhée, Loïs, Danielsson, Karin, Enqvist, Lena, Grill, Kalle, and Borit, Melania
- Subjects
- *
HACKATHONS , *COMPUTER programming , *ACADEMIC achievement , *ORGANIZATIONAL learning , *KNOWLEDGE management - Abstract
Whereas hackathons are widespread within and outside academia and have been argued to be a valid pedagogical method for teaching interdisciplinarity, no detailed frameworks or methods are available for conceptualizing and organizing educational hackathons, i.e., hackathons dedicated to best achieving pedagogic objectives. This paper introduces EDUCational Hackathons for learning how to solve Interdisciplinary Challenges (EDUCHIC) through: (1) defining the fundamental principles for framing an activity as an EDUCHIC, integrating principles from pedagogical methods, hackathon organization, and interdisciplinarity processes; (2) describing general properties that EDUCHIC possess as a consequence of the interaction of the fundamental principles; (3) developing operational guidelines for streamlining the practical organization of EDUCHIC, including an exhaustive end‐to‐end process covering all the steps for organizing EDUCHIC and practical frames for making the key decisions in this process; and (4) demonstrating these guidelines by illustrating their application to organizing a concrete EDUCHIC. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. Programming as a language for young children to express and explore mathematics in school.
- Author
- Goldenberg, E. Paul and Carter, Cynthia J.
- Subjects
- MATHEMATICS education, COMPUTER programming, ACTIVE learning, SCHOOL children, PRIMARY education
- Abstract
Natural language helps express mathematical thinking and contexts. Conventional mathematical notation (CMN) best suits expressions and equations. Each is essential; each also has limitations, especially for learners. Our research studies how programming can be an advantageous third language that can also help restore mathematical connections that are hidden by topic‐centred curricula. Restoring opportunities for surprise and delight reclaims mathematics' creative nature. Studies of children's use of language in mathematics and their programming behaviours guide our iterative design/redesign of mathematical microworlds in which students, ages 7–11, use programming in their regular school lessons as a language for learning mathematics. Though driven by mathematics, not coding, the microworlds develop the programming over time so that it continues to support children's developing mathematical ideas. This paper briefly describes microworlds EDC has tested with well over 400 7‐to‐8‐year‐olds in school, and others tested (or about to be tested) with over 200 8‐to‐11‐year‐olds. Our challenge was to satisfy schools' topical orientation and fit easily within regular classroom study but use and foreshadow other mathematical learning to remove the siloes. The design/redesign research and evaluation is exploratory, without formal methodology. We are also more formally studying effects on children's learning. That ongoing study is not reported here.
Practitioner notes
What is already known:
- Active learning—doing—supports learning.
- Collaborative learning—doing together—supports learning.
- Classroom discourse—focused, relevant discussion, not just listening—supports learning.
- Clear articulation of one's thinking, even just to oneself, helps develop that thinking.
What this paper adds:
- The common languages we use for classroom mathematics—natural language for conveying the meaning and context of mathematical situations and for explaining our reasoning, and the formal (written) language of conventional mathematical notation, the symbols we use in mathematical expressions and equations—are both essential, but each presents hurdles that necessitate the other. Yet, even together, they are insufficient, especially for young learners.
- Programming, appropriately designed and used, can be the third language that both reduces barriers and provides the missing expressive and creative capabilities children need.
- Appropriate design for use in regular mathematics classrooms requires making key mathematical content obvious, strong and the 'driver' of the activities, and requires reducing tech 'overhead' to near zero.
- Continued usefulness across the grades requires developing children's sophistication and knowledge with the language; the powerful ways that children rapidly acquire facility with (natural) language provide guidance for ways they can learn a formal language as well.
Implications for policy and/or practice:
- Mathematics teaching can take advantage of the ways children learn through experimentation and attention to the results, and of the ways children use their language brain even for mathematics.
- In particular, programming—in microworlds driven by the mathematical content, designed to minimise distraction and overhead, open to exploration and discovery en route to focused aims, and in which children self‐evaluate—can allow clear articulation of thought and experimentation with immediate feedback.
- As it aids the mathematics, it also builds computational thinking and satisfies schools' increasing concerns to broaden access to ideas of computer science. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
20. GOOGLE'S Duplex: Pretending to be human.
- Author
- O'Leary, Daniel E.
- Subjects
- TURING test, CODES of ethics, SOCIAL processes, COMPUTER programming, ETHICS
- Abstract
SUMMARY: Google's Duplex is a computer‐based system with natural language capabilities that provides a human‐sounding conversation as it performs a set of tasks, such as making restaurant reservations. This paper analyses Google's Duplex and some of the initial reaction to the system and its capabilities. The paper performs a text analysis and finds that the system‐generated text yields standardized ratings suggesting the text is analytical, authentic and possesses a generally positive tone. As would be expected for the applications for which it is being used, the text is heavily focused on the present. In addition, this analysis indicates that the text provides evidence of social processes, cognitive processes, tentativeness and affiliation. Further, this paper examines some of the characteristics of speech that Duplex uses to sound human. Those capabilities appear to allow the system to pass the Turing test for some well‐structured tasks. However, this paper investigates some of the ethics of pretending to be human and suggests that such impersonation is against evolving computer codes of ethics. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
21. Resource provisioning in Science Clouds: Requirements and challenges.
- Author
-
López García, Álvaro, Fernández‐del‐Castillo, Enol, Orviz Fernández, Pablo, Campos Plasencia, Isabel, and Marco de Lucas, Jesús
- Subjects
CLOUD computing ,INFORMATION technology ,COMPUTER programming ,COMPUTER systems ,APPLICATION software - Abstract
Summary: Cloud computing has permeated the information technology industry in the last few years and is now emerging in scientific environments. Science user communities demand a broad range of computing power to satisfy the needs of high‐performance applications, a demand traditionally met by local clusters, high‐performance computing systems, and computing grids. Different computational models require different workloads, and the cloud is already considered a promising paradigm. The scheduling and allocation of resources is always a challenging matter in any form of computation, and clouds are no exception. Science applications have unique features that differentiate their workloads; hence, their requirements have to be taken into consideration when building a Science Cloud. This paper discusses the main scheduling and resource allocation challenges for any Infrastructure as a Service provider supporting scientific applications. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
22. Industrial experiences from evolving measurement systems into self‐healing systems for improved availability.
- Author
-
Staron, Miroslaw, Meding, Wilhelm, Tichy, Matthias, Bjurhede, Jonas, Giese, Holger, and Söder, Ola
- Subjects
COMPUTER software development ,COMPUTER software industry ,COMPUTER programming ,COMPUTER software ,COMPUTER science - Abstract
Summary: Automated measurement programs are an efficient way of collecting, processing, and visualizing measures in large software development companies. The number of measurements in these programs is usually large, owing to the diversity of stakeholder needs. In this paper, we present the application of self‐healing concepts to assure the availability of measurements to the stakeholders without effort‐intensive and costly manual interventions by operators. We study the measurement infrastructure at one of the development units of a large infrastructure provider and present how the Monitor, Analyze, Plan, and Execute with Knowledge (MAPE‐K) model was instantiated in a simple manner to reduce the need for manual intervention in the operation of the measurement systems. Based on the experiences from the two cases studied in this paper, we show how an evolution toward self‐healing measurement systems is achieved both with a dedicated failure taxonomy and with effective, straightforward handling of the most common execution errors. The mechanisms studied and presented in this paper show that self‐healing provides significant improvements to the operation of the measurement program and reduces the need for daily oversight of the measurement systems by an operator. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
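The MAPE‐K loop instantiated in entry 22 can be sketched minimally as follows. This is a hedged illustration, not the studied company's system: the failure taxonomy entries, measure fields (`age_hours`, `source_ok`) and repair actions are hypothetical placeholders.

```python
# Minimal MAPE-K sketch for a self-healing measurement system.
# All names (stale, restart_collection_job, ...) are illustrative only.

KNOWLEDGE = {
    "stale": "restart_collection_job",    # failure taxonomy -> repair action
    "missing_source": "notify_operator",  # unclassified problems escalate to a human
}

def monitor(measure):
    """Collect symptoms about one measurement."""
    return {"age_hours": measure["age_hours"], "source_ok": measure["source_ok"]}

def analyze(symptoms):
    """Classify symptoms against the failure taxonomy; None means healthy."""
    if not symptoms["source_ok"]:
        return "missing_source"
    if symptoms["age_hours"] > 24:
        return "stale"
    return None

def plan(failure):
    """Look up the repair action in the knowledge base."""
    return KNOWLEDGE.get(failure)

def execute(action, measure):
    """Apply the repair; here we just record which action ran."""
    measure.setdefault("repairs", []).append(action)
    return measure

def mape_k(measure):
    """One pass of the loop: only unhealthy measures trigger a repair."""
    failure = analyze(monitor(measure))
    if failure is None:
        return measure
    return execute(plan(failure), measure)
```

The point of the loop is that the common, taxonomised failures are repaired with no operator in the loop, while unknown ones still surface to a human.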
23. Using multimodal learning analytics to understand effects of block‐based and text‐based modalities on computer programming.
- Author
-
Sun, Dan, Ouyang, Fan, Li, Yan, Zhu, Chengcong, and Zhou, Yang
- Subjects
- *
HIGH schools , *COMPUTER software , *RESEARCH funding , *DATA analytics , *RESEARCH methodology , *LEARNING strategies , *PROGRAMMED instruction - Abstract
Background: With the development of computational literacy, there has been a surge in both research and practical application of text‐based and block‐based modalities within the field of computer programming education. Despite this trend, little work has examined how learners engage in the programming process when utilizing these two major programming modalities, especially in secondary education settings. Objectives: To compare programming effects between and within text‐based and block‐based modalities, this research conducted a quasi‐experimental study in a Chinese secondary school. Methods: An online programming platform, Code4all, was developed to allow learners to program in text‐based and block‐based modalities. This research collected multimodal data sources, including programming platform data, process data, and performance data, and utilized multiple learning analytics approaches (i.e., clustering analysis, click stream analysis, lag‐sequential analysis and statistics) to compare learners' programming features, behavioural patterns and knowledge gains under the two modalities. Results and Conclusions: The results indicated that learners in the text‐based modality tended to write longer lines of code, encountered more syntactical errors, and took longer to attempt debugging. In contrast, learners in the block‐based modality spent more time operating blocks and attempting debugging, and achieved better programming knowledge performance than their counterparts. Further analysis of five clusters from the two modalities revealed discrepancies in programming behavioural patterns. Implications: Three major pedagogical implications were proposed based on the empirical results. Furthermore, this research contributes to the learning analytics literature by integrating process‐oriented and summative analysis to reveal learners' programming learning quality.
Lay Description: What is currently known about the subject matter:
- Programming has the potential to improve learners' higher‐order thinking skills.
- Block‐based and text‐based modalities are two major instructional methods.
- There has been growing interest in understanding how learning occurs in the two modes.
- Most previous work has evaluated the two modalities based on learners' knowledge, skills, and attitudes.
What this paper adds:
- Code4all allows learners to program in text‐based and block‐based modalities.
- Quasi‐experimental research was conducted to examine block‐based and text‐based programming modalities.
- Multimodal learning analytics were used to compare programming under the two modalities.
- Learners' programming features, behavioural patterns, and knowledge gains were identified under the two modalities.
The implications of study findings for practitioners:
- Instructors should integrate text‐based and block‐based modalities into programming courses.
- Process‐oriented assessment should be integrated with summative assessment.
- Adaptive, timely scaffolding should be provided alongside external support. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
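The lag‐sequential analysis named in entry 23 compares how often one coded behaviour follows another against what independence would predict. A minimal sketch of that core computation (observed vs. expected transition counts, without the adjusted residuals a full analysis adds; the event labels are invented for illustration):

```python
from collections import Counter

def lag_sequential(events, lag=1):
    """Observed transition counts at the given lag, plus the counts expected
    if successive behaviours were independent (row total * column total / n)."""
    pairs = list(zip(events, events[lag:]))
    n = len(pairs)
    obs = Counter(pairs)
    starts = Counter(a for a, _ in pairs)   # how often each behaviour starts a pair
    ends = Counter(b for _, b in pairs)     # how often each behaviour ends a pair
    exp = {p: starts[p[0]] * ends[p[1]] / n for p in obs}
    return obs, exp
```

A transition whose observed count clearly exceeds its expected count is a candidate behavioural pattern (e.g. "edit" reliably followed by "run").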
24. An architecture refactoring approach to reducing software hierarchy complexity.
- Author
-
Zhao, Yongxin, Wu, Wenhan, Fei, Yuan, Liu, Zhihao, Li, Yang, Yang, Yilong, Shi, Ling, and Zhang, Bo
- Subjects
- *
SOFTWARE refactoring , *COMPUTER software correctness , *COMPUTER software quality control , *SOFTWARE architecture , *COMPUTER programming , *BATTERY management systems - Abstract
Summary: Software complexity is the very essence of computer programming. As complexity increases, the potential risks and defects of software systems increase, making software correctness analysis and software quality improvement more difficult. In this paper, we present a quantitative metric to describe the complexity of hierarchical software and a Complexity‐oriented Software Architecture Refactoring (CoSSR) approach to reduce that complexity. The main idea is to identify subcomponents and reassemble them into one hierarchical component that achieves minimum complexity under the solution algorithm. Moreover, the algorithm can be improved by introducing partition constraints, a heuristic search strategy, and spectral clustering. We implement the proposed method as an automated refactoring tool and demonstrate the algorithm through a case study of a battery management system (BMS). The results show that our approach is efficient and effective in reducing the complexity of hierarchical software systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. A computational model of bounded developable surfaces with application to image-based three-dimensional reconstruction.
- Author
-
Perriollat, Mathieu and Bartoli, Adrien
- Subjects
APPLICATION software ,IMAGE reconstruction ,IMAGE processing ,COMPUTER graphics ,COMPUTER programming ,COMPUTER vision ,THREE-dimensional imaging - Abstract
ABSTRACT Developable surfaces have been extensively studied in computer graphics because they are involved in a large body of applications. This type of surface has also been used in computer vision and document processing in the context of three-dimensional (3D) reconstruction for book digitization and augmented reality. Indeed, the shape of a smoothly deformed piece of paper can be very well modeled by a developable surface. Most of the existing developable surface parameterizations do not handle boundaries or are driven by overly large parameter sets. These two characteristics become issues in the context of developable surface reconstruction from real observations. Our main contribution is a generative model of bounded developable surfaces that solves these two issues. Our model is governed by intuitive parameters whose number depends on the actual deformation, and which include the 'flat shape boundary'. The vast majority of existing image-based paper 3D reconstruction methods either require a tightly controlled environment or restrict the set of possible deformations. We propose an algorithm for reconstructing our model's parameters from a general smooth 3D surface interpolating a sparse cloud of 3D points. The latter is assumed to be reconstructed from images of a static piece of paper or any other developable surface. Our 3D reconstruction method is well adapted to the use of keypoint matches over multiple images. In this context, the initial 3D point cloud is reconstructed by structure-from-motion, for which mature and reliable algorithms now exist, and the thin-plate spline is used as a general smooth surface model. After initialization, our model's parameters are refined with model-based bundle adjustment. We experimentally validated our model and 3D reconstruction algorithm for shape capture and augmented reality on seven real datasets. 
The first six datasets consist of multiple images or videos and a sparse set of 3D points obtained by structure-from-motion. The last dataset is a dense 3D point cloud acquired by structured light. Our implementation has been made publicly available on the authors' web home pages. Copyright © 2012 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
26. Reliability‐based calibration of design code formulas: Application to shear resistance formulas for reinforced concrete members without shear reinforcement.
- Author
-
Slobbe, Arthur, Rózsás, Árpád, and Yang, Yuguang
- Subjects
SHEAR reinforcements ,REINFORCED concrete ,COMPUTER programming ,REINFORCED concrete testing ,CALIBRATION - Abstract
This paper presents a reliability‐based calibration method for design code formulas. The method is demonstrated on the shear design formulas in Eurocode 2 and fib Model Code 2010 (MC2010). We found that the partial factor γc in the current Eurocode 2 is about 20% lower than the optimal value and, thus, provides an insufficient safety margin. The obtained optimal partial factor γR in the (modified) Eurocode 2 and MC2010 formulas is 1.53 and 1.36, respectively. The difference stems from higher accuracy and, hence, lower uncertainty of the MC2010 model in predicting experimental results. Hence, on average, the MC2010 formula leads to about 13% larger design resistances compared to Eurocode 2 given that the target reliability for both design formulas is the same. To stimulate and facilitate future structural code development and derivation of partial factors, we make the used computer code freely available. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
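The partial-factor mechanics behind entry 26 are simple to state: a design resistance is the characteristic resistance divided by the partial factor, so the two calibrated factors (1.53 and 1.36) imply different design values. A hedged numerical sketch, using a hypothetical characteristic resistance and ignoring the differences between the resistance models themselves:

```python
def design_resistance(r_characteristic, gamma):
    """Design value of a resistance: characteristic value over partial factor."""
    return r_characteristic / gamma

# Illustrative only: the same hypothetical characteristic shear resistance
# under the two calibrated partial factors reported in the abstract.
r_k = 100.0                           # kN, hypothetical characteristic value
ec2 = design_resistance(r_k, 1.53)    # modified Eurocode 2 formula
mc2010 = design_resistance(r_k, 1.36) # fib Model Code 2010 formula
ratio = mc2010 / ec2                  # about 1.125, i.e. roughly 13% larger
```

Holding the characteristic value fixed, 1.53/1.36 ≈ 1.125, consistent with the abstract's "about 13% larger design resistances"; in the paper the difference is traced to the lower model uncertainty of MC2010.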
27. The effect of automatic assessment on novice programming: Strengths and limitations of existing systems.
- Author
-
Ullah, Zahid, Lajis, Adidah, Jamjoom, Mona, Altalhi, Abdulrahman, Al‐Ghamdi, Abdullah, and Saleem, Farrukh
- Subjects
FLIPPED classrooms ,TEACHING ,PROGRAMMING languages ,COMPUTER programming - Abstract
Computer programming is a constant source of concern for students in introductory programming courses. High failure rates occur every semester due to a lack of adequate programming skills. No student can become a programmer overnight, because such learning requires proper guidance as well as consistent practice with programming exercises. The role of instructors in the development of students' learning skills is crucial in order to provide feedback on their errors and improve their knowledge accordingly. On the other hand, because of the large number of students, instructors are overburdened when trying to focus on each individual student's errors. To address these issues, researchers have developed numerous Automatic Assessment (AA) systems that not only evaluate students' programs but also provide instant feedback on their errors and reduce the workload of the instructors. Given the large pool of existing systems, it is difficult to cover every system in one study. Therefore, this paper provides a comprehensive overview of some of the existing systems based on three analysis approaches: dynamic, static, and hybrid. Moreover, this paper discusses the strengths and limitations of these systems and suggests some potential recommendations regarding AA specifications for novice programming, which may help in standardizing these systems. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
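The dynamic-analysis approach surveyed in entry 27 boils down to executing a submission against instructor test cases and turning failures into instant feedback. A minimal sketch (the function names and test cases are illustrative, not from any surveyed system):

```python
# Sketch of the dynamic-analysis core of an Automatic Assessment system:
# run the submission against instructor-provided cases, return score + feedback.

def assess(func, test_cases):
    feedback, passed = [], 0
    for args, expected in test_cases:
        try:
            got = func(*args)
        except Exception as exc:  # runtime errors become feedback, not a crash
            feedback.append(f"{args}: raised {type(exc).__name__}")
            continue
        if got == expected:
            passed += 1
        else:
            feedback.append(f"{args}: expected {expected}, got {got}")
    return passed / len(test_cases), feedback

# A buggy hypothetical student submission for absolute value (fails on negatives):
def student_abs(x):
    return x

score, notes = assess(student_abs, [((3,), 3), ((-4,), 4), ((0,), 0)])
```

Static and hybrid systems differ in that they inspect the code (style, structure, similarity to a model solution) instead of, or in addition to, executing it.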
28. Code the mime: A 3D programmable charades game for computational thinking in MaLT2.
- Author
-
Grizioti, Marianthi and Kynigos, Chronis
- Subjects
COMPUTER science ,EDUCATIONAL games ,GEOMETRY ,COMPUTER programming ,LEARNING theories in education - Abstract
In this paper, we discuss the need for new approaches to research regarding coding to support students in developing practices in computational thinking, such as abstraction and decomposition, in multidisciplinary contexts. We explore students' activities with a tool integrating constructionist textual programming activity with game‐based learning and specifically game modding. In this context, we designed a programmable 'design‐to‐play' game developed with the computational environment MaLT2. MaLT2 offers the affordances of textual programming, dynamic manipulation, and 3D navigation for the design of 3D animated models, aiming to give children access to otherwise complex computational and mathematical ideas. To develop an understanding of children's learning activity regarding computational practices, we organised an empirical study with middle‐school students, who played a game called 'Code‐the‐Mime'. It is a charades‐based game in which the players manipulate, programme, and modify a digital human model to describe a word to their teammates. The preliminary findings indicate that the affordances of MaLT2 in conjunction with the game context enabled students to express and develop key computational practices, including decomposition, pattern recognition, analysis and abstraction, in a meaningful and multidisciplinary context.
Practitioner notes
What is already known about this topic:
- Computational Thinking is considered a key 21st‐century skill in preparing the young to become digital citizens. It involves concepts and practices that can be used to solve problems computationally across multiple fields.
- However, there is still limited knowledge of how students develop computational practices, such as abstraction, pattern recognition and decomposition, and how they may express and apply them in diverse contexts.
- Students' engagement with computational practices is unlikely to be supported either by closed, simplified coding tasks or by higher‐level advanced programming exercises. There is a need to clarify the manifestation of these practices and how they can be realised, expressed and used by learners in meaningful and transdisciplinary contexts.
What this paper adds:
- It suggests the design of constructionist computational games that integrate design and programming into the gameplay, aiming to engage students with computational practices in a multidisciplinary, authentic context.
- It provides an example of a 'design‐to‐play' charades‐like game, developed in a 3D modelling programming environment, that embeds real‐life representations into computational design to enable 'syntonic learning' of computational practices.
- Furthermore, it analyses student learning activity to elaborate on arguments and issues related to this approach.
Implications for practice and/or policy:
- There is added value in disconnecting computational thinking from positivist diagnostic approaches related to respective concepts and studying it in ways more related to realistic problem‐solving situations and multidisciplinary contexts.
- The study contributes to the scientific clarification of computational practices concerning how they are realised and expressed by students in different contexts, through an original example of educational practice.
- The discussed approach and tools can contribute to the design and development of innovative digital media, embedding affordances for concepts and practices while maintaining relevance and interest for their users. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
29. Threshold concepts, conceptions and skills: Teachers' experiences with students' engagement in functions.
- Author
-
Kallia, Maria and Sentance, Sue
- Subjects
COMPUTER software ,TEACHING methods ,RESEARCH methodology ,COLLEGE teacher attitudes ,INTERVIEWING ,PHENOMENOLOGY ,LEARNING strategies ,JUDGMENT sampling ,DELPHI method - Abstract
Threshold concepts have been characterised in the literature as jewels in the curriculum because they can inform teaching and learning practices. Therefore, identifying and addressing threshold concepts in any discipline is critical. The aim of the current study is to explore the existence of threshold concepts in computer programming, specifically in the area of functions. Building on our previous work, in which we identified 11 potential threshold concepts in functions by employing the Delphi method, along with seven misconceptions that students hold in this area of programming, the current study further explores computing teachers' experiences with students' engagement with 4 of the 11 concepts using an interpretative phenomenological analysis of interviews. The analysis revealed that, of these concepts, parameters, parameter passing and return values likely form a threshold conception, and procedural decomposition is a procedural threshold (threshold skill). The study presents the framework that led us to the identification of these thresholds in computer programming, presents the computing teachers' experiences with these concepts, and concludes with the implications of these results for students' learning and teaching practices in computer programming.
Lay Description: What is already known about this topic:
- Threshold concepts are concepts that are particularly troublesome for students, essentially because of their transformative and integrative characteristics.
- Students struggle to understand these concepts, and learning therefore becomes a barrier.
- In computer programming, research suggests that concepts like object‐oriented programming, recursion and pointers are examples of threshold concepts, but more research is needed to identify further threshold concepts and how students experience them.
What this paper adds:
- The paper explores teachers' experiences teaching the concepts of parameters, parameter passing, return values and procedural decomposition, and identifies common student learning difficulties with these concepts.
- The paper investigates students' transformations and integration of knowledge once they have understood these concepts, through the teachers' reflective practice.
- The paper suggests that parameters, parameter passing and return values form a threshold conception in programming, and that procedural decomposition is a procedural threshold.
Implications for practice and/or policy:
- The paper highlights many problematic and troublesome areas in functions in which students experience difficulties. This output can be used by teachers who would like to organise their lessons around difficult points in functions and prepare appropriate materials that will help students overcome the corresponding obstacles.
- The paper identifies epistemological and ontological transformations that students experience once they overcome the conceptual difficulties and finally grasp the threshold concepts. This outcome has a significant impact on the teaching of computer programming. Teachers should take into consideration the powerful conceptual relationships these concepts form and the transformations that students exhibit, by creating learning environments that bring about these transformations and assist students in their attempt to understand and identify themselves in the computer classroom. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
30. Introduction to the special section: Teaching critical GIS | Teaching GIS critically.
- Author
-
Warren, Stacy
- Subjects
COMPUTER programming ,TEACHING - Abstract
Even more contentious is the question of what teaching critical GIS entails, echoing questions raised by Elwood and Wilson (2017) who observe that teaching critical GIS and teaching GIS critically are not necessarily the same thing. Together, they represent a candid and realistic examination of how the contours of the medium, be it open source GIS, programming code, or artistic materials, can be leveraged to create critical GIS curricular experience that remains relevant to the challenges around us. Recognizing that these far broader changes have impacted how we teach, how students learn, and what it means to interact with GIS software, we set out to contextualize Anna's words in a reflective manner that situates her experiences in longer GIS trajectories. The innovative critical GIS master's program that Anna helped shape back in 2015, for example, experienced the slow-burning financial squeeze described in our paper that necessitated subsequent waves of nimble dodging and trimming. [Extracted from the article]
- Published
- 2020
- Full Text
- View/download PDF
31. Compilation of MATLAB computations to CPU/GPU via C/OpenCL generation.
- Author
-
Reis, Luís, Bispo, João, and Cardoso, João M. P.
- Subjects
COMPUTER architecture ,COMPUTER programming ,PROGRAMMING languages ,COMPUTING platforms ,COMPUTER performance ,GRAPHICS processing units ,COMPILERS (Computer programs) - Abstract
Summary: In order to take advantage of the processing power of current computing platforms, programmers typically need to develop software versions for different target devices. This task is time‐consuming and requires significant programming and computer architecture expertise. A possible and more convenient alternative is to start with a single high‐level description of a program with minimum implementation details and generate custom implementations according to the target platform. In this paper, we use MATLAB as a high‐level programming language and propose a compiler that targets CPU/GPU computing platforms by generating customized implementations in C and OpenCL. We propose a number of compiler techniques to automatically generate efficient C and OpenCL code from MATLAB programs. One such technique relies on heuristics to decide when and how to use Shared Virtual Memory (SVM). The experimental results show that our approach is able to generate code that provides significant speedups (e.g., a geometric mean speedup of 11× for a set of simple benchmarks) using a discrete GPU over equivalent sequential C code executing on a CPU. With more complex benchmarks, for which only some code regions can be parallelized and thus offloaded, the generated code achieved speedups of up to 2.2×. We also show the impact of using SVM, specifically fine‐grained buffers; the results show that the compiler achieves significant speedups, both over the versions without SVM and over those with naïve, aggressive SVM use, across three CPU/GPU platforms. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
32. Challenges in the development of green and sustainable software for software multisourcing vendors: Findings from a systematic literature review and industrial survey.
- Author
-
Salam, Muhammad and Khan, Siffat Ullah
- Subjects
SUSTAINABILITY ,COMPUTER software development ,ELECTRIC power consumption ,EMISSION control ,COMPUTER programming - Abstract
Abstract: Green and sustainable software development has emerged in recent years, and vendors are constantly striving to develop software that has a less hazardous impact on the environment, economy, and human beings. However, developing green software in the context of software multisourcing is not a risk‐free activity, and vendors are exposed to several challenges. The authors have conducted both a systematic literature review and an industrial survey to identify the challenges faced by multisourcing vendors in the development of green and sustainable software. The final publication sample for the systematic literature review is composed of 54 research papers. Similarly, the final sample of the industrial survey is composed of 108 relevant experts. The authors have identified a list of 14 challenges. Of these challenges/risk factors, 8 have been tagged as critical risk factors. These critical risk factors are “lack of green RE practices,” “high power consumption,” “high carbon emission throughout the software development,” “poor software design (architectural, logical, physical, and user interface),” “lack of ICTs for coordination and communication,” “high resource requirements,” “lack of coding standards,” and “lack of green software development knowledge.” [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
33. Query expansion based on statistical learning from code changes.
- Author
-
Huang, Qing, Yang, Yangrui, Zhan, Xue, Wan, Hongyan, and Wu, Guoqing
- Subjects
INFORMATION retrieval ,STATISTICAL learning ,SEARCH algorithms ,QUERY (Information retrieval system) ,COMPUTER programming - Abstract
Summary: Thesaurus‐based, code‐related, and software‐specific query expansion techniques are the main contributions in free‐form query search. However, these techniques still cannot place the most relevant query result in the first position, because they lack the ability to infer the expansion words that represent the user's needs from a given query. In this paper, we discover that code changes can imply what users want and propose a novel query expansion technique with code changes (QECC). It exploits (changes, contexts) pairs from changed methods. On the basis of statistical learning from these pairs, it can infer code changes for a given query. In this way, it expands a query with code changes and recommends query results that better meet actual needs. In addition, we implement InstaRec to perform QECC and evaluate it with 195 039 change commits from GitHub and our code tracker. The results show that QECC can improve the precision of three code search algorithms (i.e., IR, Portfolio, and VF) by 52% to 62% and outperform the state‐of‐the‐art query expansion techniques (i.e., query expansion based on crowd knowledge and CodeHow) by 13% to 16% when the top 1 result is inspected. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
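The statistical-learning step entry 33 describes can be illustrated with a much-simplified co-occurrence sketch: count which change terms appear with which context terms in (changes, contexts) pairs, then expand a query with the change terms most associated with its terms. This is a generic illustration of the idea, not QECC's actual model; all terms are invented.

```python
from collections import Counter

def build_assoc(change_context_pairs):
    """Co-occurrence statistics over (change term, context term) pairs
    mined from changed methods."""
    assoc = {}
    for change, context in change_context_pairs:
        assoc.setdefault(context, Counter())[change] += 1
    return assoc

def expand(query_terms, assoc, k=2):
    """Expand a query with the k change terms most often seen alongside
    the query's terms."""
    scores = Counter()
    for term in query_terms:
        scores += assoc.get(term, Counter())
    return list(query_terms) + [t for t, _ in scores.most_common(k)]
```

The full technique learns richer statistics from real change commits, but the shape is the same: contexts vote for the changes that usually accompany them.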
34. Modified covariance intersection for data fusion in distributed nonhomogeneous monitoring systems network.
- Author
-
Daeichian, Abolghasem and Honarvar, Elham
- Subjects
DATA fusion (Statistics) ,REMOTE sensing ,AUTOREGRESSIVE models ,COMPUTER simulation ,COMPUTER programming - Abstract
Summary: Monitoring networks contain monitoring nodes that observe an area of interest to detect any existing object and estimate its states. Each node has characteristics, such as probability of detection and clutter density, that may take different values for distinct nodes in nonhomogeneous monitoring networks. This paper proposes a modified covariance intersection method for data fusion in such networks. It is derived by formulating a mixed game model with neighbouring monitoring nodes as players and taking the inverse of the trace of the fused covariance matrix as the players' utility function. Monitoring nodes estimate the states of any existing object by applying a joint target detection and tracking filter to their own observations. Processing nodes fuse the estimated states received from neighbouring monitoring nodes using the proposed modified covariance intersection. The method is validated by simulating the target detection and tracking problem in two situations: a single target, and an unknown number of targets. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
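For orientation, the standard covariance intersection rule that entry 34 modifies can be sketched as follows: fuse two estimates whose cross-correlation is unknown by taking a convex combination of their information matrices, choosing the weight that minimises the trace of the fused covariance (the quantity the paper's game utility is built on). This is the textbook rule, not the paper's game-theoretic variant; the grid search over the weight is an illustrative simplification.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, steps=100):
    """Standard covariance intersection of estimates (x1, P1) and (x2, P2):
    P_f^{-1} = w*P1^{-1} + (1-w)*P2^{-1}, with w chosen on a grid to
    minimise trace(P_f)."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)  # information matrices
    best = None
    for w in np.linspace(0.0, 1.0, steps + 1):
        P = np.linalg.inv(w * I1 + (1 - w) * I2)   # fused covariance
        x = P @ (w * I1 @ x1 + (1 - w) * I2 @ x2)  # fused state
        cand = (np.trace(P), x, P)
        if best is None or cand[0] < best[0]:
            best = cand
    return best[1], best[2]
```

When each node is accurate in a different state dimension, the fused covariance has a smaller trace than either input, which is what makes fusion across nonhomogeneous nodes worthwhile.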
35. So What Is in an Earth System Model?
- Author
-
Jones, C. D.
- Subjects
CLIMATOLOGY ,EARTH system science ,WEATHER forecasting ,SIMULATION methods & models ,COMPUTER programming ,SCIENTIFIC community - Abstract
Use of numerical models is at the heart of climate science. Models underpin process understanding, our ability to explain observed past changes, and the predictions/projections of the coming decades that help guide climate policy. Roland Séférian and colleagues (2019, https://doi.org/10.1029/2019MS001791) have written an excellent paper documenting CNRM‐CERFACS's latest complex model, "CNRM‐ESM2‐1," which will be used for CMIP6 simulations and will produce data to be analyzed by the research community for many years to come. Here I explain the importance of Séférian et al. and why it showcases a new degree of comprehensiveness needed for model documentation activity. Plain Language Summary: Just like weather forecasting, climate science relies heavily on simulating the Earth's weather patterns using complicated numerical models on very powerful supercomputers. These models often represent tens or hundreds of thousands of lines of computer code and encompass as much understanding of how the real world operates as we can include in them. Their complexity means that it is very important to document what is in them and how they behave. In this commentary I explain why Roland Séférian and colleagues (2019, https://doi.org/10.1029/2019MS001791) do a particularly good job of explaining and documenting the developments in their new model, "CNRM‐ESM2‐1." Key Points: This commentary describes Séférian et al.'s documentation of CNRM's new Earth System Model. Séférian et al. represents an example of model documentation and traceability for others to follow. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
36. Students' interest in Scratch coding in lower secondary mathematics.
- Author
-
Dohn, Niels Bonderup
- Subjects
MATHEMATICS education (Secondary) ,COMPUTER programming ,STUDENT interests ,LITERACY ,EDUCATIONAL technology ,SIXTH grade (Education) - Abstract
The ability to code computer programs is considered an important part of literacy in today's society. This paper reports from a case study in two sixth‐grade classes where Scratch coding was part of six mathematics lessons. The aim of the study was to investigate how Scratch coding affected students' interest development in coding and in mathematics. Data were collected using a convergent parallel mixed methods design. The results show a slight, but nevertheless significant, negative effect on students' average interest in coding, as well as in mathematics. Students attributed this to the level of difficulty and the tedious workflow, indicating that their waning interest was due to the prescriptive nature of tasks that offered neither a sense of accomplishment nor the chance for autonomous input. However, situational interest was triggered in off‐task coding situations. These situations were not related to the mathematical coding tasks but to the use of existing Scratch games and animations. The findings point to the importance of design principles that allow students an opportunity to tinker, but also to a need for an increased focus on facilitating the development of design knowledge within teacher professional development. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
37. Revising the computer programming attitude scale in the context of attitude ambivalence.
- Author
-
Yusuf, Abdullahi and Noor, Norah Md
- Subjects
- *
EXPERIMENTAL design , *RESEARCH , *ATTITUDES toward computers , *ACADEMIC medical centers , *CONFIDENCE intervals , *RESEARCH methodology evaluation , *RESEARCH methodology , *SOFTWARE architecture , *MULTITRAIT multimethod techniques , *RESEARCH funding , *FACTOR analysis , *DESCRIPTIVE statistics , *STUDENT attitudes , *EMPIRICAL research , *ODDS ratio , *LOGISTIC regression analysis - Abstract
Background: Several attitude scales have been developed to measure students' attitudes toward computer programming, including the prominent one developed by Cetin and Ozden. The development of these scales stemmed from the elusive nature of attitude and the lack of specific constructs to measure attitude. These instruments measure students' attitudes from a one‐dimensional perspective, making it difficult to interpret the meaning of some attitude evaluations, such as the meaning of neutral points in a 10‐point scale (for example). Objectives: The computer programming attitude scale was modified to measure ambivalence. The study also investigates attitude differences across demographic variables and uses these variables to predict ambivalence. Methods: The study was conducted in two phases. In the first phase, the instrument was validated using exploratory factor analysis and confirmatory factor analysis. In the second phase, the revised scale was administered to another 547 students in four research universities for empirical investigation. Results: The results show that the instrument is valid and suitable for measuring students' programming attitudes. Participants' attitudes skewed toward the negative attitude dimension. Lastly, we found that both attitude and ambivalence are factors of programming experience. Conclusions: We discuss the findings, recommend the instrument to programming tutors, and strongly emphasise the evaluation of students' ambivalent attitudes.
Lay Description: What is currently known: There are numerous scales developed to measure students' attitudes toward computer programming, such as the computer programming attitude scale (CPAS). The development of these inventories stemmed from the elusive nature of attitude and the lack of specific constructs to measure attitude. These instruments measure students' attitudes from a one‐dimensional perspective, making it difficult to interpret the meaning of some attitude evaluations, such as the meaning of neutral points in a 10‐point scale (for example). What this paper adds: The present research brings a novel approach to instrument development by acknowledging the presence of attitude ambivalence toward computer programming. This novel approach advances the knowledge of instrument development in the field of attitude measurement by introducing a paradigm shift from conventional attitude evaluation to a more diverse approach. The study has introduced new items to the existing CPAS and, subsequently, indicates the possibility of validating an attitude inventory using data obtained from ambivalence evaluation. By using this instrument, students' attitudes are measured accurately, rather than by adopting conventional measures that ignore the co-presence of negative and positive evaluations. Implication to practitioners: Through the findings of the study, computer science and programming educators would come to know that many positive and negative attitude elements are occasionally subsumed toward an attitude object, thereby leading to an ambivalent attitude. The instrument can be used by practitioners to accurately measure students' positive and negative attitudes toward programming across all levels of education. Using the instrument, practitioners can predict the identified attitude using a wide range of variables such as gender, age, computational thinking skills, and programming proficiency levels, among others. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
38. Machines and metaphors: Challenges for the detection, interpretation and production of metaphors by computer programs.
- Author
-
Hesse, Jacob
- Subjects
METAPHOR ,COMPUTER programming ,NATURAL language processing ,PHILOSOPHY of language ,PRAGMATICS - Abstract
Powerful transformer models based on neural networks such as GPT‐4 have enabled huge progress in natural language processing. This paper identifies three challenges for computer programs dealing with metaphors. First, the phenomenon of Twice‐Apt‐Metaphors shows that metaphorical interpretations do not have to be triggered by syntactical, semantic or pragmatic tensions. The detection of these metaphors seems to involve a sense of aesthetic pleasure or a higher‐order theory of mind, both of which are difficult to implement into computer programs. Second, the contexts relative to which metaphors are interpreted are not simply given but must be reconstructed based on pragmatic considerations that can involve presuppositional pretence. If computer programs cannot produce or understand such a form of pretence, they will have problems dealing with certain metaphors. Finally, adequately interpreting and reacting to some metaphors seems to require the ability to have internal, first‐personal experiential and affective states. Since it is questionable whether computer programs have such mental states, it can be assumed that they will have problems with these kinds of metaphors. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
39. Direct pseudo-spectral method for optimal control of obstacle problem - an optimal control problem governed by elliptic variational inequality.
- Author
-
Khaksar‐e Oshagh, M. and Shamsi, M.
- Subjects
OPTIMAL control theory ,MATHEMATICAL programming ,FUNCTIONAL equations ,CONVEX programming ,COMPUTER programming - Abstract
In this paper, a computational technique based on the pseudo-spectral method is presented for the solution of the optimal control problem constrained with elliptic variational inequality. In fact, our aim in this paper is to present a direct approach for this class of optimal control problems. By using the pseudo-spectral method, the infinite dimensional mathematical programming with equilibrium constraint, which can be an equivalent form of the considered problem, is converted to a finite dimensional mathematical programming with complementarity constraint. Then, the finite dimensional problem can be solved by the well-developed methods. Finally, numerical examples are presented to show the validity and efficiency of the technique. Copyright © 2017 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
40. Prioritizing manual test cases in rapid release environments.
- Author
-
Hemmati, Hadi, Fang, Zhihan, Mäntylä, Mika V., and Adams, Bram
- Subjects
SCALES (Weighing instruments) ,SOURCE code ,COMPUTER programming ,EMPIRICAL research ,EQUIPMENT & supplies - Abstract
Test case prioritization is an important testing activity in practice, especially for large-scale systems. The goal is to rank the existing test cases so that they detect faults as soon as possible, and so that any partial execution of the test suite detects the maximum number of defects for the given budget. Test prioritization becomes even more important when test execution is time consuming, for example, manual system tests versus automated unit tests. Most existing test case prioritization techniques are based on code coverage, which requires access to source code. However, manual testing is mainly performed in a black-box manner (manual testers do not have access to the source code). Therefore, in this paper, existing test case prioritization techniques (e.g. diversity-based and history-based techniques) are examined and modified to be applicable to manual black-box system testing. An empirical study on four older releases of desktop Firefox showed that none of the techniques strongly dominated the others in all releases. However, when nine more recent releases of desktop Firefox were studied, where development had moved from a traditional to a more agile and rapid release environment, a very significant difference between the history-based approach and its alternatives was observed. The higher effectiveness of the history-based approach compared with alternatives also held on 28 additional rapid releases of other Firefox projects - mobile Firefox and tablet Firefox. The conclusion of the paper is that test cases in rapid release environments can be very effectively prioritized for execution based on their historical failure knowledge. In particular, it is the recency of historical knowledge that explains its effectiveness in rapid release environments, rather than other changes in the process. Copyright © 2016 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
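The recency-weighted, history-based ranking the abstract credits can be sketched as follows. This is a hypothetical illustration, not the paper's actual implementation: the test names, failure records, and the exponential decay weight are all invented for the example.

```python
# Sketch of history-based test prioritization (hypothetical data and weights):
# rank test cases by recency-weighted past failures, so failures in recent
# releases count far more than failures long ago -- the paper's key finding
# is that this recency is what makes the approach effective.

def history_priority(failure_history, decay=0.5):
    """failure_history: {test_id: [1 if failed in release k else 0, ...]},
    index 0 = oldest release. Returns test ids, most suspicious first."""
    def score(hist):
        n = len(hist)
        # release k gets weight decay**(age); the newest release has age 0
        return sum(f * decay ** (n - 1 - k) for k, f in enumerate(hist))
    return sorted(failure_history, key=lambda t: score(failure_history[t]),
                  reverse=True)

history = {
    "t_login":  [1, 0, 0, 0],  # failed long ago only
    "t_sync":   [0, 0, 1, 1],  # failing in the two most recent releases
    "t_render": [0, 0, 0, 0],  # never failed
}
order = history_priority(history)
print(order)  # t_sync ranks first: its failures are the most recent
```

A partial execution budget then simply takes a prefix of `order`.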
41. Self-modeling based diagnosis of network services over programmable networks.
- Author
-
Sánchez Vílchez, José Manuel, Ben Yahia, Imen Grida, Lac, Chidung, and Crespi, Noel
- Subjects
SELF diagnosis ,SOFTWARE-defined networking ,VIRTUAL machine systems ,COMPUTER performance ,COMPUTER programming - Abstract
In this paper, we propose a multilayer self-diagnosis framework for network services within software-defined networking and network functions virtualization environments. The framework encompasses 3 main contributions: (1) the definition of multilayered templates to identify the components to supervise across the physical, logical, virtual, and service layers; these templates are also fine-grained, extendable, and machine-readable; (2) a topology-aware and service-aware self-modeling module that takes the templates as input, instantiates them, and generates an on-the-fly diagnosis model, which includes the physical, logical, and virtual dependencies of network services; (3) a topology-aware and service-aware root cause analysis approach that takes into account the network services' views and their underlying network resources' observations within the aforementioned layers to automate the diagnosis of programmable networks. We also present extensive simulations to demonstrate and evaluate the following aspects: fully automated diagnosis model generation and a fine-grained, reduced-uncertainty diagnosis of the root cause of network service failures, including those of their underlying resources. We include in this extended paper relevant state-of-the-art on topology- and service-aware diagnosis approaches for different types of network technologies, a deeper insight into our approach and problem formalization, and additional results. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
42. Decomposition of Flowcharts Using a Digraph Hierarchical Property.
- Author
-
Kato, June and Miyake, Nobuhisa
- Subjects
DIRECTED graphs ,FLOW charts ,GRAPHIC methods ,ALGORITHMS ,COMPUTER programming ,PROGRAMMING languages ,ELECTRONIC data processing - Abstract
This paper discusses the hierarchical properties of digraphs and proposes flowchart decomposition algorithms using these properties. The hierarchical structure discussed here is based only on the topological relationships between vertices and the inclusive relationships among sets of vertices. When the given flowchart represents the control flow of a computer program, this hierarchical structure also shows strong correspondence between these inclusive relationships and program modules. The proposed flowchart decomposition algorithm consists of one algorithm for detecting modules and another for further decomposing the detected modules. This paper focuses on the former. Because the hierarchical structure discussed in this paper corresponds to the traditional program modules (e.g., functions or procedures) of most computer programming languages, the module-detection algorithm can be used not only for decomposing flowcharts but for more general purposes, including module design. Two types of module-detection algorithm are discussed in this paper. One detects all modules, and the other detects a restricted subset of modules from the standpoint of flowchart decomposition. We then demonstrate with a practical example that the latter algorithm is ten times faster than the former. [ABSTRACT FROM AUTHOR]
- Published
- 1995
- Full Text
- View/download PDF
43. An elementary finite element exercise to stimulate computational thinking in engineering education.
- Subjects
ENGINEERING education ,FINITE element method ,SOFTWARE engineers ,SOFTWARE engineering ,COMPUTER programming - Abstract
In this paper, an elementary exercise that can strengthen computational thinking in engineering analysis and design is outlined and discussed. The exercise is a simple finite element assignment designed for M.Sc. students in Mechanical and Civil Engineering. It comprises a two‐member frame with a variety of loading that can be solved manually without any computer programming. Individual data are allocated to each student, who reports results such as reaction forces, reaction moments, and stresses. Thereafter, the students model the same frame structure in the commercially available finite element method (FEM) software ABAQUS, which creates a substantial output file even for such a simple problem. In the final stage, each student highlights the ABAQUS results so that they can be compared with, and commented upon against, the results obtained from their manual calculations. Since the assignment is individual, it provides a justifiable connection between the computational thinking of an individual mind and the printed outputs of complicated FEM software. Implementing this assignment in the M.Sc. finite element course at Aberdeen University has been very successful. It enabled individual students to relate their computational thinking to the results of complicated FEM software in engineering. It is concluded that such exercises can stimulate computational thinking in engineering education. Moreover, they can be used in other engineering fields where FEM is applicable, either at the university level or in Professional and Career Development courses in engineering. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
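The hand-calculation-versus-FEM comparison the abstract describes can be sketched in miniature. This is a hypothetical reduction, not the paper's two-member frame or its ABAQUS model: a 1D axial bar with two elements, whose FEM tip displacement is checked against the closed-form hand result; all material and load values are invented.

```python
import numpy as np

# Minimal 1D axial-bar FEM (two elements, fixed at the left end, tip load P):
# assemble the element stiffness k = AE/L * [[1,-1],[-1,1]], solve, and
# compare the FEM tip displacement with the hand calculation P*(2L)/(A*E).
E, A, L, P = 210e9, 1e-4, 1.0, 1000.0     # hypothetical steel bar and load
k = A * E / L * np.array([[1.0, -1.0], [-1.0, 1.0]])

K = np.zeros((3, 3))                       # 3 nodes, 1 dof each
for e in (0, 1):                           # element e connects nodes e, e+1
    K[e:e+2, e:e+2] += k

F = np.array([0.0, 0.0, P])                # load applied at the free tip
u = np.zeros(3)
u[1:] = np.linalg.solve(K[1:, 1:], F[1:])  # node 0 fixed: drop its row/column

manual_tip = P * 2 * L / (A * E)           # hand result for total length 2L
print(u[2], manual_tip)                    # the two values should agree
```

The pedagogical point mirrors the assignment: the student can verify every number the solver prints against a line of manual algebra.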
44. Software fault localisation: a systematic mapping study.
- Author
-
Zakari, Abubakar, Sai Peck Lee, Alam, Khubaib Amjad, and Ahmad, Rodina
- Subjects
COMPUTER software ,DEBUGGING ,COMPUTATIONAL complexity ,SOFTWARE engineers ,COMPUTER programming - Abstract
Software fault localisation (SFL) is recognised as one of the most tedious, costly, and critical activities in program debugging. Due to the increase in software complexity, there is huge interest in advanced SFL techniques that aid software engineers in locating program bugs. This interest has produced a large body of literature in the SFL research domain. This study aims to investigate the overall research productivity, demographics, and trends shaping the landscape of the SFL research domain. The research also aims to classify existing fault localisation techniques and identify trends in the field of study. Accordingly, a systematic mapping study of 273 primary selected studies was conducted, adopting an evidence-based systematic methodology to ensure coverage of all relevant studies. The results of this systematic mapping study show that the SFL research domain has been gaining attention since 2010, with an increasing number of publications per year. Three main research facets were identified, i.e. validation research, evaluation research, and solution research, with solution research receiving the most attention. Various contribution facets were identified as well. Overall, the general demographics of the SFL research domain were highlighted and discussed. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
45. High‐order ADE scheme for solving the fluid diffusion equation in non‐uniform grids and its application in coupled hydro‐mechanical simulation.
- Author
-
Prassetyo, Simon Heru and Gutierrez, Marte
- Subjects
HEAT equation ,GRID computing ,STABILITY theory ,APPROXIMATION theory ,COMPUTER programming - Abstract
Summary: To improve the stability and efficiency of explicit techniques, one proposed method is to use an unconditionally stable alternating direction explicit (ADE) scheme. However, the standard ADE scheme is only moderately accurate and is restricted to uniform grids. This paper derives a novel high‐order ADE scheme capable of solving the fluid diffusion equation in non‐uniform grids. The new scheme is derived by performing a fourth‐order finite difference approximation to the spatial derivatives of the diffusion equation in a non‐uniform grid. The implicit Crank‐Nicolson technique is then applied to the resulting approximation, and the subsequent equation is split into two alternating direction sweeps, giving rise to a new high‐order ADE scheme. Because the new scheme can potentially be applied in coupled hydro‐mechanical (H‐M) simulation, the pore pressure solutions from the new scheme are then sequentially coupled with an existing geomechanical simulator in the computer program Fast Lagrangian Analysis of Continua. This coupling procedure is called the sequentially explicit coupling technique based on the fourth‐order ADE scheme (SEA‐4). Verification against well‐known consolidation problems showed that the new ADE scheme and SEA‐4 can reduce computer runtime by 46% to 75% relative to Fast Lagrangian Analysis of Continua's basic scheme. At the same time, the techniques still maintained an average percentage error of 1.6% to 3.5% for pore pressure and 0.2% to 1.5% for displacement solutions, and remained accurate under typical grid non‐uniformities. This result suggests that the new high‐order ADE scheme can provide an efficient explicit technique for solving the flow equation of a coupled H‐M problem, which will be beneficial for large‐scale and long‐term H‐M problems in geoengineering. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
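The "standard ADE scheme" that the abstract uses as its baseline can be sketched for the 1D diffusion equation on a uniform grid. This is the classic Saul'yev two-sweep construction, shown here as a hedged illustration of why ADE is attractive (explicit sweeps, yet stable at time steps where plain explicit differencing diverges); it is not the paper's high-order, non-uniform-grid scheme.

```python
import numpy as np

# Classic alternating direction explicit (Saul'yev) scheme for u_t = a*u_xx
# on a uniform grid. Each step performs one left-to-right and one
# right-to-left sweep and averages them; both sweeps are explicit (each new
# value uses one already-updated neighbour), yet the scheme remains stable
# for diffusion numbers r = a*dt/dx**2 where plain FTCS blows up.

def ade_step(u, r):
    n = len(u)
    up = u.copy()                      # left-to-right sweep
    for i in range(1, n - 1):
        up[i] = ((1 - r) * u[i] + r * (up[i - 1] + u[i + 1])) / (1 + r)
    dn = u.copy()                      # right-to-left sweep
    for i in range(n - 2, 0, -1):
        dn[i] = ((1 - r) * u[i] + r * (dn[i + 1] + u[i - 1])) / (1 + r)
    return 0.5 * (up + dn)             # average the two sweeps

x = np.linspace(0.0, 1.0, 51)
u = np.sin(np.pi * x)                  # initial profile, u = 0 at both ends
r = 2.0                                # FTCS would be unstable at r > 0.5
for _ in range(20):
    u = ade_step(u, r)
print(u.max())                         # amplitude decays smoothly, no blow-up
```

The boundary values are never updated, which enforces the fixed u = 0 conditions at both ends.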
46. Towards a deeper understanding of test coverage.
- Author
-
Kanstrén, Teemu
- Subjects
TESTING ,COMPUTER software ,OPEN source software ,COMMERCIAL product testing ,SOFTWARE maintenance ,COMPUTER programming - Abstract
Test coverage is traditionally considered as how much of the code is covered by the test suite as a whole. However, test suites typically contain different types of tests with different roles, such as unit tests, integration tests and functional tests. As traditional measures of test coverage make no distinction between the different types of tests, the overall view of test coverage is limited to what is covered by the tests in general. This paper proposes a quantitative way to measure the test coverage of the different parts of the software at different testing levels. It is also shown how this information can be used in software maintenance and development to further evolve the test suite and the system under test. The technique is applied to an open-source project to show its application in practice. Copyright © 2007 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
47. A comparison of concurrent programming and cooperative multithreading under load balancing applications.
- Author
-
Maris, Justin T., Keen, Aaron W., Takashi Ishibara, and Olsson, Ronald A.
- Subjects
COMPUTER programming ,THREADS (Computer programs) ,COMPUTER multitasking ,WORKLOAD of computer networks ,SIMULTANEOUS multithreading processors ,PARALLEL programming ,DISTRIBUTED computing ,SYNCHRONIZATION ,COMPUTER software - Abstract
Two models of thread execution are the general concurrent programming execution model (CP) and the cooperative multithreading execution model (CM). CP provides nondeterministic thread execution where context switches occur arbitrarily. CM provides threads that execute one at a time until they explicitly choose to yield the processor. This paper focuses on a classic application to reveal the advantages and disadvantages of load balancing during thread execution under CP and CM styles; results from a second classic application were similar. These applications are programmed in two different languages (SR and Dynamic C) on different hardware (standard PCs and embedded system controllers). An SR-like run-time system, DesCaRTeS, was developed to provide interprocess communication for the Dynamic C implementations. This paper compares load balancing and non-load balancing implementations; it also compares CP and CM style implementations. The results show that in cases of very high or very low workloads, load balancing slightly hindered performance; and in cases of moderate workload, both SR and Dynamic C implementations of load balancing generally performed well. Further, for these applications, CM style programs outperform CP style programs in some cases, but the opposite occurs in some other cases. This paper also discusses qualitative tradeoffs between CM style programming and CP style programming for these applications. Copyright © 2004 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
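The CP/CM contrast in the abstract above can be made concrete with a toy cooperative scheduler. This is a hypothetical illustration using Python generators as cooperative threads, not the paper's SR or Dynamic C code: under CM, a context switch happens only where a thread explicitly yields, so the interleaving is deterministic, whereas under CP the switch points are arbitrary.

```python
# Cooperative multithreading (CM) in miniature: each "thread" is a generator
# that runs until it explicitly yields; the scheduler switches contexts only
# at those yield points, unlike preemptive CP where switches are arbitrary.

def worker(name, units, log):
    for i in range(units):
        log.append(f"{name}:{i}")   # do one unit of work
        yield                       # explicit yield: the only switch point

def run_cooperative(threads):
    log = []
    ready = [worker(n, u, log) for n, u in threads]
    while ready:                    # simple round-robin scheduler
        for t in list(ready):
            try:
                next(t)             # resume thread until its next yield
            except StopIteration:
                ready.remove(t)     # thread finished all its work
    return log

log = run_cooperative([("A", 2), ("B", 3)])
print(log)  # deterministic interleaving: A:0 B:0 A:1 B:1 B:2
```

Because no switch can occur between the `log.append` and the `yield`, each unit of work is effectively atomic, which is the synchronization advantage CM offers over CP.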
48. Empirical analysis of the relationship between CC and SLOC in a large corpus of Java methods and C functions.
- Author
-
Landman, Davy, Serebrenik, Alexander, Bouwers, Eric, and Vinju, Jurgen J.
- Subjects
SOFTWARE maintenance ,COMPUTER programming ,SOFTWARE measurement ,SOURCE code ,LOSSLESS data compression - Abstract
Measuring the internal quality of source code is one of the traditional goals of making software development into an engineering discipline. Cyclomatic complexity (CC) is an often used source code quality metric, next to source lines of code (SLOC). However, the use of the CC metric is challenged by the repeated claim that CC is redundant with respect to SLOC because of strong linear correlation. We conducted an extensive literature study of the CC/SLOC correlation results. Next, we tested correlation on large Java (17.6 M methods) and C (6.3 M functions) corpora. Our results show that linear correlation between SLOC and CC is only moderate, as a result of increasingly high variance. We further observe that aggregating CC and SLOC, as well as performing a power transform, improves the correlation. Our conclusion is that the observed linear correlation between CC and SLOC of Java methods or C functions is not strong enough to conclude that CC is redundant with SLOC. This conclusion contradicts earlier claims from the literature but concurs with the widely accepted practice of measuring CC next to SLOC. Copyright © 2015 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
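The core measurement in the study above can be sketched as follows. The per-method data here are invented, and a log transform stands in for the paper's power transform; the point is only to show the mechanics of comparing raw versus transformed correlation, not to reproduce the paper's corpus-scale results.

```python
import math

# Sketch of a CC-vs-SLOC correlation check (hypothetical data): compute the
# Pearson correlation on raw counts and again after a log transform, the
# kind of power transform the study reports as improving the correlation.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical per-method measurements: (SLOC, CC) pairs
sloc = [5, 12, 30, 45, 80, 120, 200, 400]
cc   = [1,  2,  3, 10,  6,  25,  12,  60]

r_raw = pearson(sloc, cc)
r_log = pearson([math.log(s) for s in sloc], [math.log(c) for c in cc])
print(round(r_raw, 2), round(r_log, 2))
```

On real corpora the study's argument is about the variance around this fit, which a single correlation coefficient understates, hence its conclusion that CC is not redundant with SLOC.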
49. RJ: a Java package providing JR-like concurrent programming.
- Author
-
Olsson, Ronald A. and Williamson, Todd
- Subjects
JR (Computer program language) ,COMPUTER software development ,PROGRAMMING languages ,VIRTUAL machine systems ,COMPUTER programming ,JAVA programming language - Abstract
The JR concurrent programming language extends Java with a richer concurrency model, by adding several new types and statements. JR provides dynamic remote virtual machine creation, dynamic remote object creation, remote method invocation, dynamic process creation, rendezvous, asynchronous message passing, semaphores, concurrent invocation, and shared variables. This paper presents RJ, a package for Java that provides JR-like features. The paper gives an overview of RJ and its key features; describes the implications of RJ's design, including how RJ provides additional, useful flexibility; discusses the implementation of RJ; and gives qualitative and quantitative evaluations of our work with respect to feasibility and usability, experimentation, migration, and performance. RJ has been successful in meeting these goals and in providing insight into the trade-offs between using a concurrent programming language versus using the equivalent concurrent package. Our work has yielded a few surprises in dealing with some concurrent programming language features, in understanding the run-time performances of JR versus RJ programs, and in obtaining some additional, useful flexibility for concurrent programming applications. Copyright © 2015 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
50. The SIPSim implicit parallelism model and the SkelGIS library.
- Author
-
Coullon, Hélène and Limet, Sébastien
- Subjects
COMPUTER simulation ,PARALLEL computers ,COMPUTER programming ,PERSONNEL management ,C++ - Abstract
Scientific simulations give rise to complex codes where data size and computation time become very important issues, and sometimes a scientific barrier. Thus, parallelization of scientific simulations becomes significant work. Much time and human effort is spent producing efficient parallel programs. Still, many simulations are never parallelized because of a lack of time to learn parallel programming or a lack of human resources. Therefore, aiding parallelization through abstracted or implicit parallelism has become a main topic in computer science. Many implicit parallelism solutions have been proposed, such as algorithmic skeleton libraries, domain-specific languages, or specific libraries. This paper introduces a new type of solution that gives totally transparent access to parallel programming for non-computer scientists in the domain of numerical simulations. This solution is an implicit parallelism model called Structured Implicit Parallelism on scientific Simulations (SIPSim). After a description of the SIPSim model, this paper presents the implementation of the model, as a C++ templated library called SkelGIS, for two different cases of simulations: simulations on Cartesian meshes and simulations of two physical phenomena linked through a network. For each case, the implementation of the SIPSim components is described, and a simple simulation example is given. SkelGIS is then evaluated on two real cases, one of each kind, first on the resolution of the shallow water equations and second on an arterial blood flow simulation. To clearly assess SkelGIS's performance and its ease of programming, different experiments on both cases are evaluated. Copyright © 2015 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF