40 results
Search Results
2. Constructing Envelopes: How Institutional Custodians Can Tame Disruptive Algorithms.
- Author
-
Marti, Emilio, Lawrence, Thomas B., and Steele, Christopher W. J.
- Subjects
COMPUTER algorithms ,OCCUPATIONAL roles ,SECURITIES trading ,ORGANIZATIONAL governance ,VALUES (Ethics) ,HUMAN-artificial intelligence interaction - Abstract
The infusion of algorithms into organizational fields—accelerated by advances in artificial intelligence—can have disruptive effects that trigger defensive responses. One important response involves establishing a boundary around an algorithm to delimit its interactions with its environment—in engineering terms, constructing an "envelope." Yet, we know little about the process through which such envelopes are constructed. We address this issue by exploring how institutional custodians construct envelopes around disruptive algorithms. We empirically examine custodians' responses to the high-frequency trading algorithms that disrupted the field of U.S. securities trading, focusing on the years 2009–2016. Our inductive analysis shows that custodians created an envelope with interconnected normative, governance, and practice "layers" that jointly constrained high-frequency trading. Each layer emerged as custodians "coupled" one element of the field (e.g., its values) to one aspect of the disruptive algorithms (e.g., their impacts). Our study contributes to research on the social dynamics of algorithms by generating novel theory of how envelopes around algorithms are constructed, and to research on institutional custodianship by highlighting the constructing of envelopes as a custodial response to a wide range of threats—including, but not restricted to, disruptive algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. professional activities.
- Subjects
ASSOCIATIONS, institutions, etc. ,CONFERENCES & conventions ,COMPUTER programming ,COMPUTER algorithms ,COMPUTER science - Abstract
The article presents information on professional activities within the Association for Computing Machinery (ACM). The ACM Southeastern Regional Meeting will be held on April 18-20 at the Sheraton-Nashville Hotel in Nashville, Tennessee. The ACM will also sponsor the Sixth International Users Conference on May 14-17 at the Sheraton-Anaheim Hotel in Anaheim, California. Coast Community. The Boy Scouts of America (BSA) recently published a booklet for use in its merit badge program on Computing. BSA is now looking for computer professionals to help Scouts attain the required skills for this badge; interested persons should contact their local Boy Scout offices.
- Published
- 1974
4. Critical literacy for a posthuman world: When people read, and become, with machines.
- Author
-
Leander, Kevin M. and Burriss, Sarah K.
- Subjects
CRITICAL literacy ,LITERACY education ,ARTIFICIAL intelligence ,COMPUTER algorithms ,EDUCATIONAL technology ,SOCIAL media ,ADULTS ,PROFESSIONAL education - Abstract
Computational objects (eg, algorithms, bots, surveillance technology and data) have become increasingly present in our daily lives and are consequential for our changing relations to texts, multimodality and identity. Yet, our current theories of literacy, and especially the prevalence of mediational and representational perspectives, are inadequate to account for these changing relations. What are the implications for critical literacy education when it takes seriously computational agents that interact, produce and process texts? While such work is only beginning in education, scholars in other fields are increasingly writing about how AI and algorithmic mediation are changing the landscape of online intra‐action, and business strategies and tactics for working with AI are advancing far ahead of critical literacy education. Drawing on our own and others' research into non‐human actors online, and building on posthuman theories of networks, heterogeneous actants and the assemblage, in this conceptual paper, we sketch some of the forms of critical consciousness that media education might provide in this new mixed landscape.
Practitioner Notes
What is already known about this topic:
- AI is a hot topic in education and in public discourse, but critical literacy theories have not sufficiently accounted for how AI and computational agents change what it means to be "critically literate."
- Technology is an important force in shaping (and is also shaped by) literacy practices and identity.
- Corporate actors have an enormous influence on the texts we read and write, but this influence is often hidden.
What this paper adds:
- We bridge between critical literacy studies and posthumanist theory to conceptualize critical posthuman literacy.
- We argue for re‐imagining what texts, multimodality and identity are and do in the age of AI.
- We pose new questions of our texts and ourselves, informed by posthuman critical literacy.
Implications for practice and/or policy:
- Today's readers and composers must be able to identify and interrogate networks of computational and human agents that permeate literacy practices.
- Beyond identifying and understanding computational agents, posthuman critical literacy necessitates that people can actively build more ethical assemblages with computational agents. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
5. Reproducibility report: Team SegFAUlt @ SCC 2016.
- Author
-
Ditter, Alexander, Laukemann, Jan, and Oehlrich, Benedikt
- Subjects
- *
CLUSTER analysis (Statistics) , *SUPERCOMPUTERS , *STUDENTS , *METAGENOMICS , *PARALLEL computers , *COMPUTER algorithms , *COMPUTER software - Abstract
This publication is based on the reproducibility task of the 2016 Student Cluster Competition (SCC) at the Supercomputing Conference (SC) 2016, Salt Lake City, Utah, USA. The SCC is a 48 h non-stop supercomputing challenge, where teams from all over the world, consisting of six undergraduate students each, run and optimize real-world scientific workloads and applications. For the first time in 2016, one of the tasks was to reproduce the results of a scientific paper. Besides the software part of the competition, the students also have to decide on a suitable hardware configuration and assemble their cluster on-site. The only limitation is the power limit of 3120 W. The reproducibility task, which is described in this publication, was based on the 2015 SC paper “A parallel connectivity algorithm for de Bruijn graphs in metagenomic applications” by Patrick Flick et al. presenting the first non-lossy parallel decomposition of metagenomic assembly. The students had to (i) report general information about the provided datasets, (ii) create graphs of communication and computation time similar to those in the original paper using their own results, (iii) compare them to the conclusions of the work being reproduced and (iv) perform a strong scaling study on the dataset, including the generation of graphs comparable to the original ones. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
6. Computer Vision and Augmented Reality for Human-Centered Fatigue Crack Inspection.
- Author
-
Mojidra, Rushil, Li, Jian, Mohammadkhorasani, Ali, Moreu, Fernando, Bennett, Caroline, and Collins, William
- Subjects
FATIGUE cracks ,AUGMENTED reality ,BRIDGE inspection ,COMPUTER vision ,COMPUTER algorithms ,STRUCTURAL health monitoring ,DECISION making - Abstract
A significant percentage of bridges in the United States are serving beyond their 50-year design life, and many of them are in poor condition, making them vulnerable to fatigue cracks that can result in catastrophic failure. However, current fatigue crack inspection practice based on human vision is time-consuming, labor intensive, and prone to error. We present a novel human-centered bridge inspection methodology to enhance the efficiency and accuracy of fatigue crack detection by employing advanced technologies including computer vision and augmented reality (AR). In particular, a computer vision-based algorithm is developed to enable near-real-time fatigue crack detection by analyzing structural surface motion in a short video recorded by a moving camera of the AR headset. The approach monitors structural surfaces by tracking feature points and measuring variations in distances between feature point pairs to recognize the motion pattern associated with the crack opening and closing. Measuring distance changes between feature points, as opposed to their displacement changes before this improvement, eliminates the need of camera motion compensation and enables reliable and computationally efficient fatigue crack detection using the nonstationary AR headset. In addition, an AR environment is created and integrated with the computer vision algorithm. The crack detection results are transmitted to the AR headset worn by the bridge inspector, where they are converted into holograms and anchored on the bridge surface in the 3D real-world environment. The AR environment also provides virtual menus to support human-in-the-loop decision-making to determine optimal crack detection parameters. This human-centered approach with improved visualization and human–machine collaboration aids the inspector in making well-informed decisions in the field in a near-real-time fashion. The proposed crack detection method is comprehensively assessed using two laboratory test setups for both in-plane and out-of-plane fatigue cracks. Finally, using the integrated AR environment, a human-centered bridge inspection is conducted to demonstrate the efficacy and potential of the proposed methodology. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
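The crack-detection idea summarized in entry 6 above, monitoring variations in distances between tracked feature-point pairs to recognize crack opening and closing, can be illustrated with a minimal sketch. This is not the authors' implementation; the synthetic feature tracks, the pairing scheme, and the fluctuation threshold are assumptions for illustration only.

```python
import numpy as np
from itertools import combinations

def flag_crack_like_pairs(tracks, rel_threshold=0.02):
    """Flag feature-point pairs whose separation fluctuates over time.

    tracks: array of shape (n_frames, n_points, 2) with (x, y) pixel
            coordinates of tracked feature points in each video frame.
    rel_threshold: relative fluctuation of the pair distance (peak-to-peak
            divided by mean) above which a pair is flagged.

    Because only distances *between* points are used, rigid camera motion
    largely cancels out, which is the idea behind avoiding explicit
    camera-motion compensation.
    """
    n_frames, n_points, _ = tracks.shape
    flagged = []
    for i, j in combinations(range(n_points), 2):
        d = np.linalg.norm(tracks[:, i, :] - tracks[:, j, :], axis=1)
        fluctuation = (d.max() - d.min()) / max(d.mean(), 1e-9)
        if fluctuation > rel_threshold:
            flagged.append((i, j, fluctuation))
    return flagged

# Synthetic demo: 60 frames, 5 points; point 4 oscillates relative to the rest,
# mimicking a crack that opens and closes under fatigue loading.
rng = np.random.default_rng(0)
base = rng.uniform(0, 200, size=(5, 2))
tracks = np.repeat(base[None, :, :], 60, axis=0) + rng.normal(0, 0.05, (60, 5, 2))
tracks[:, 4, 0] += 2.0 * np.sin(np.linspace(0, 6 * np.pi, 60))  # crack-like motion
print(flag_crack_like_pairs(tracks))
```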
7. A Novel Dynamic Adjusting Algorithm for Load Balancing and Handover Co-Optimization in LTE SON.
- Author
-
Li, Wen-Yu, Zhang, Xiang, Jia, Shu-Cong, Gu, Xin-Yu, Zhang, Lin, Duan, Xiao-Yu, and Lin, Jia-Ru
- Subjects
COMPUTER algorithms ,LONG-Term Evolution (Telecommunications) ,WIRELESS Internet ,MATHEMATICAL optimization - Abstract
With the development of the mobile internet and multimedia services, advanced techniques need to be applied in wireless networks to improve user experience. Long term evolution (LTE) systems, which can offer downlink data rates of up to 100 Mbps, have been deployed in the USA and Korea. However, because plenty of complex physical layer algorithms are utilized, network planning and optimization become heavy burdens for LTE network operators. Self-organizing network (SON) technology is a promising way to overcome this problem by automatically selecting and adjusting key parameters in LTE systems. In this paper, we present a dynamic adjusting algorithm to improve both handover and load balancing performance by introducing a weighted co-satisfaction factor (CSF). Analysis and system-level simulation are conducted to exhibit the performance improvement of the proposed scheme. Results show that the proposed method significantly outperforms conventional solutions in terms of the network handover success ratio and load balancing gains. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
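The abstract for entry 7 describes a weighted co-satisfaction factor (CSF) that jointly scores handover and load-balancing performance and is then used to adjust network parameters. The paper's actual CSF definition and adjustment rule are not given in the abstract, so the sketch below is only a hypothetical illustration: the weight `w`, the two satisfaction measures, and the handover-offset update step are all assumptions.

```python
def co_satisfaction_factor(handover_success_ratio, load_balance_index, w=0.5):
    """Hypothetical weighted co-satisfaction factor in [0, 1].

    handover_success_ratio: fraction of successful handovers (0..1).
    load_balance_index: 1 means perfectly balanced load across cells, 0 worst.
    w: weight trading off handover performance against load balancing.
    """
    return w * handover_success_ratio + (1.0 - w) * load_balance_index


def adjust_handover_offset(offset_db, csf, target=0.9, step_db=0.5,
                           min_db=-6.0, max_db=6.0):
    """Toy dynamic adjustment: nudge the cell's handover offset until the
    CSF reaches a target level, clamped to a plausible dB range."""
    if csf < target:
        offset_db += step_db   # hand users over earlier to relieve the loaded cell
    else:
        offset_db -= step_db   # back off to avoid unnecessary handovers
    return min(max(offset_db, min_db), max_db)


# Example: a loaded cell with good handover success but poor load balance.
csf = co_satisfaction_factor(handover_success_ratio=0.97, load_balance_index=0.55)
print(csf, adjust_handover_offset(offset_db=0.0, csf=csf))
```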
8. Quantitative evaluation of footwear evidence: Initial workflow for an end‐to‐end system.
- Author
-
Venkatasubramanian, Gautham, Hegde, Vighnesh, Lund, Steven P., Iyer, Hari, and Herman, Martin
- Subjects
WORKFLOW ,FOOTWEAR ,CRIME scenes ,COGNITIVE bias ,COMPUTER algorithms - Abstract
In the United States, footwear examiners make decisions about the sources of crime scene shoe impressions using subjective criteria. This has raised questions about the accuracy, repeatability, reproducibility, and scientific validity of footwear examinations. Currently, most footwear examiners follow a workflow that compares a questioned and test impression with regard to outsole design, size, wear, and randomly acquired characteristics (RACs). We augment this workflow with computer algorithms and statistical analysis so as to improve in the following areas: (1) quantifying the degree of correspondence between the questioned and test impressions with respect to design, size, wear, and RACs, (2) reducing the potential for cognitive bias, and (3) providing an empirical basis for examiner conclusions by developing a reference database of case‐relevant pairs of impressions containing known mated and known nonmated impressions. Our end‐to‐end workflow facilitates all three of these points and is directly relatable to current practice. We demonstrate the workflow, which includes obtaining and interpreting outsole pattern scores, RAC comparison scores, and final scores, on two scenarios—a pristine example (involving very high quality Everspry EverOS scanner impressions) and a mock crime scene example that more closely resembles real casework. These examples not only demonstrate the workflow but also help identify the algorithmic, computational, and statistical challenges involved in improving the system for eventual deployment in casework. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
9. Real-life examination timetabling.
- Author
-
Müller, Tomáš
- Subjects
SCHEDULING ,UNIVERSITIES & colleges ,COMPUTER algorithms ,COMPUTER software - Abstract
An examination timetabling problem at a large American university is presented. Although there are some important differences, the solution approach is based on the ITC 2007 winning solver which is integrated in the open source university timetabling system UniTime. In this work, nine real world benchmark data sets are made publicly available and the results on four of them are presented in this paper. A new approach to further decreasing the number of student conflicts by allowing some exams to be split into multiple examination periods is also studied. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
10. Development of an Efficient Regional Four-Dimensional Variational Data Assimilation System for WRF.
- Author
-
Zhang, Xin, Huang, Xiang-Yu, Liu, Jianyu, Poterjoy, Jonathan, Weng, Yonghui, Zhang, Fuqing, and Wang, Hongli
- Subjects
DATA management ,METEOROLOGICAL research ,WEATHER forecasting ,SCALABILITY ,COMPUTER algorithms - Abstract
This paper presents the development of a single executable four-dimensional variational data assimilation (4D-Var) system based on the Weather Research and Forecasting (WRF) Model through coupling the variational data assimilation algorithm (WRF-VAR) with the newly developed WRF tangent linear and adjoint model (WRFPLUS). Compared to the predecessor Multiple Program Multiple Data version, the new WRF 4D-Var system achieves major improvements in that all processing cores are able to participate in the computation and all information exchanges between WRF-VAR and WRFPLUS are moved directly from disk to memory. The single executable 4D-Var system demonstrates desirable acceleration and scalability in terms of the computational performance, as demonstrated through a series of benchmarking data assimilation experiments carried out over a continental U.S. domain. To take into account the nonlinear processes with the linearized minimization algorithm and to further decrease the computational cost of the 4D-Var minimization, a multi-incremental minimization that uses multiple horizontal resolutions for the inner loop has been developed. The method calculates the innovations with a high-resolution grid and minimizes the cost function with a lower-resolution grid. The details regarding the transition between the high-resolution outer loop and the low-resolution inner loop are introduced. Performance of the multi-incremental configuration is found to be comparable to that with the full-resolution 4D-Var in terms of 24-h forecast accuracy in the week-long analysis and forecast experiment over the continental U.S. domain. Moreover, the capability of the newly developed multi-incremental 4D-Var system is further demonstrated in the convection-permitting analysis and forecast experiment for Hurricane Sandy (2012), which was hardly computationally feasible with the predecessor WRF 4D-Var system. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
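Entry 10 describes a multi-incremental 4D-Var minimization in which innovations are computed on a high-resolution grid and the cost function is minimized on a coarser grid. For orientation, a standard incremental 4D-Var cost function is written out below; this is the textbook formulation, not an equation quoted from the paper, and the notation (increment δx, innovations d_i) is an assumption.

```latex
% Standard incremental 4D-Var cost function (textbook form, for orientation).
% \delta x: analysis increment on the (coarse) inner-loop grid
% B: background-error covariance; R_i: observation-error covariance at time i
% \mathbf H_i, \mathbf M_i: linearized observation operator and tangent-linear model
% d_i = y_i - H_i\big(M_i(x_b)\big): innovations, computed with the
%       full-resolution nonlinear model in the outer loop
J(\delta x) \;=\; \tfrac{1}{2}\,\delta x^{\mathsf T} B^{-1}\,\delta x
 \;+\; \tfrac{1}{2}\sum_{i=0}^{N}
 \bigl(\mathbf H_i \mathbf M_i\,\delta x - d_i\bigr)^{\mathsf T}
 R_i^{-1}
 \bigl(\mathbf H_i \mathbf M_i\,\delta x - d_i\bigr)
```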
11. Algorithms for Combined Inter- and Intra-Task Dynamic Voltage Scaling.
- Author
-
Seo, Hyungjung, Seo, Jaewon, and Kim, Taewhan
- Subjects
COMPUTER algorithms ,ELECTRIC power consumption ,EMBEDDED computer systems ,ELECTRIC potential ,DISTRIBUTED computing - Abstract
Dynamic voltage scaling (DVS) is one of the most effective techniques for reducing energy consumption on battery-operated embedded systems. According to the granularity of the units to which voltage scaling is applied, the DVS problem can be divided into two subproblems: (i) the inter-task DVS problem and (ii) the intra-task DVS problem. Many effective DVS techniques address one of the two subproblems, but none attempt to solve both simultaneously. This paper examines the problem of combined inter- and intra-task DVS, called the combined DVS (CDVS) problem. We solve the CDVS problem in two embedded system domains: systems with a sleep state and systems without one. For systems without a sleep state, we propose a close-to-optimal algorithm for the CDVS problem. We show that the algorithm is optimal when the power is a quadratically increasing function of the system's clock speed or the applied voltage level. For systems with a sleep state, we propose a refinement algorithm that fine-tunes the solution to the CDVS problem without a sleep state to further reduce energy consumption by exploiting the sleep state. Experimental results show that our proposed CDVS algorithm without a sleep state is able to reduce energy consumption by 12.5% on average over the results of a method that sequentially performs two existing inter- and intra-task DVS techniques, both of which are optimal under no sleep state. Furthermore, our CDVS algorithm with a sleep state can reduce energy consumption by 7.1% on average over the results of the conventional representative method that utilizes the sleep state but does not consider intra- and inter-task DVS simultaneously. [ABSTRACT FROM PUBLISHER]
- Published
- 2012
- Full Text
- View/download PDF
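Entry 11 notes that the combined DVS algorithm is optimal when power is a quadratically increasing function of clock speed. A small sketch of why running slower saves energy under that model: with P(f) = k·f² and execution time cycles/f, energy is k·cycles·f, so the lowest frequency that still meets the deadline minimizes energy. The constant k and the task parameters below are made up for illustration.

```python
def task_energy(cycles, freq_hz, k=1e-18):
    """Energy of one task under a quadratic power model P(f) = k * f**2.

    Execution time is cycles / f, so E = P(f) * t = k * cycles * f:
    energy grows linearly with the chosen frequency.
    """
    return k * cycles * freq_hz


def lowest_feasible_frequency(cycles, deadline_s, available_freqs_hz):
    """Pick the slowest available frequency that still meets the deadline."""
    feasible = [f for f in available_freqs_hz if cycles / f <= deadline_s]
    if not feasible:
        raise ValueError("deadline cannot be met at any available frequency")
    return min(feasible)


cycles, deadline = 2e8, 0.5                      # 200M cycles, 500 ms deadline
freqs = [4e8, 6e8, 8e8, 1e9]                     # 400 MHz .. 1 GHz
f_best = lowest_feasible_frequency(cycles, deadline, freqs)
print(f_best, task_energy(cycles, f_best), task_energy(cycles, max(freqs)))
```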
12. Efficient Image Chaotic Encryption Algorithm with No Propagation Error.
- Author
-
Awad, Abir and Awad, Dounia
- Subjects
DATA encryption ,PERTURBATION theory ,COMPUTER security ,COMPUTER algorithms ,OPTICAL images - Abstract
Many chaos-based encryption methods have been presented and discussed in the last two decades, but very few of them are suitable for secure transmission on noisy channels or comply with the standards of the National Institute of Standards and Technology (NIST). This paper tackles the problem and presents a novel chaos-based cryptosystem for the secure transmission of images. The proposed cryptosystem overcomes the drawbacks of existing chaotic algorithms such as the Socek, Xiang, Yang, and Wong methods. It takes advantage of the increasingly complex behavior of perturbed chaotic signals. The perturbing orbit technique improves the dynamic statistical properties of the generated chaotic sequences, permits the proposed algorithm to reach higher performance, and avoids the problem of error propagation. Finally, many standard tools, such as NIST tests, are used to quantify the security level of the proposed cryptosystem, and experimental results prove that the suggested cryptosystem has a high security level, lower correlation coefficients, and improved entropy. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
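Entry 12 builds on perturbed chaotic orbits and a cipher with no error propagation. The toy sketch below only illustrates those two ideas: a logistic-map keystream whose state is periodically perturbed, XORed with the data so a corrupted ciphertext byte affects only that byte. It is an assumption-laden illustration, not the paper's cryptosystem, and it is not secure for real use.

```python
import numpy as np

def perturbed_logistic_keystream(n_bytes, x0=0.61803398875, r=3.99,
                                 perturb_every=64, eps=1e-7):
    """Toy keystream from a logistic map with periodic orbit perturbation.

    Illustrates the idea of perturbing a chaotic orbit to improve its
    statistical properties; NOT the paper's algorithm and NOT secure.
    """
    x = x0
    out = np.empty(n_bytes, dtype=np.uint8)
    for i in range(n_bytes):
        x = r * x * (1.0 - x)                       # logistic map iteration
        if i % perturb_every == perturb_every - 1:  # small periodic perturbation
            x = (x + eps * ((i * 2654435761) % 97)) % 1.0
            x = min(max(x, 1e-12), 1.0 - 1e-12)     # keep the orbit in (0, 1)
        out[i] = int(x * 256) & 0xFF
    return out


def xor_image(image_bytes, key_stream):
    """XOR cipher: the same operation encrypts and decrypts, and a flipped
    ciphertext byte corrupts only that byte (no error propagation)."""
    return np.bitwise_xor(image_bytes, key_stream)


img = np.random.default_rng(1).integers(0, 256, size=32, dtype=np.uint8)
ks = perturbed_logistic_keystream(img.size)
cipher = xor_image(img, ks)
assert np.array_equal(xor_image(cipher, ks), img)   # round trip
```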
13. Real-Time Suitable Predictive Control Using SPaT Information from Automated Traffic Lights.
- Author
-
Bhat, Pradeep Krishna and Chen, Bo
- Subjects
TRAFFIC signs & signals ,COMPUTER algorithms ,INFORMATION retrieval ,UNCERTAINTY - Abstract
Traffic intersections throughout the United States include a mix of fixed-time, semi-actuated, and fully actuated signals. In the case of semi-actuated and fully actuated intersections, phase durations are uncertain because of car waiting queues and pedestrian crossings. Intelligent transportation systems deployed in traffic infrastructure can communicate Signal Phase and Timing (SPaT) messages to vehicles approaching intersections. In the connected and automated vehicle ecosystem, the fuel savings potential has been explored, but prior studies have predominantly focused on fixed-time control for the driver. Actuated signals pose a different and significant challenge because of the randomness introduced by these uncertainties. We have developed a predictive control using the SPaT information communicated from actuated traffic intersections. The developed MPC-based algorithm was validated using model-based design platforms such as AMBER®, Autonomie®, MATLAB®, and SIMULINK®. It was observed that the proposed algorithm can save energy in a single phase, in multiple phase scenarios, and in compelled stopping at stop signs when employed considering communications. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
14. Impact of ASOS Real-Time Quality Control on Convective Gust Extremes in the USA.
- Author
-
Cook, Nicholas John
- Subjects
THUNDERSTORMS ,METEOROLOGY ,COMPUTER algorithms ,DATA analysis - Abstract
Most damage to buildings across the contiguous United States, in terms of number and total cost, is caused by gusts in convective events associated with thunderstorms. Their assessment relies on the integrity of meteorological observations. This study examines the impact on risk due to valid gust observations culled erroneously by the real-time quality control algorithm of the US Automated Surface Observation System (ASOS) after 2013. ASOS data before 2014 are used to simulate the effect of this algorithm at 450 well-exposed stations distributed across the contiguous USA. The peak gust is culled in around 10% of these events causing significant underestimates of extreme gusts. The full ASOS record, 2000–2021, is used to estimate and map the 50-year mean recurrence interval (MRI) gust speeds, the conventional metric for structural design. It is concluded that recovery of erroneously culled observations is not possible, so the only practical option to eliminate underestimation is to ensure that the 50-year MRI gust speed at any given station is not less than the mean for nearby surrounding stations. This also affects stations where values are legitimately lower than their neighbors, which represents the price that must be paid to eliminate unacceptable risk. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
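Entry 14 maps 50-year mean recurrence interval (MRI) gust speeds. The abstract does not state the exact extreme-value procedure used, so the sketch below is a generic illustration of estimating an N-year return level by fitting a Gumbel distribution to annual maximum gusts; the synthetic data and the choice of a Gumbel model are assumptions.

```python
import numpy as np
from scipy import stats

def gumbel_return_level(annual_maxima, mri_years=50):
    """Fit a Gumbel (extreme value type I) distribution to annual maxima and
    return the gust speed with the given mean recurrence interval (MRI)."""
    loc, scale = stats.gumbel_r.fit(annual_maxima)
    p_non_exceed = 1.0 - 1.0 / mri_years          # e.g. 0.98 for a 50-year MRI
    return loc - scale * np.log(-np.log(p_non_exceed))

# Synthetic 22-year record of annual maximum gusts (m/s), for illustration only.
rng = np.random.default_rng(42)
annual_max = rng.gumbel(loc=28.0, scale=4.0, size=22)
print(round(gumbel_return_level(annual_max, 50), 1), "m/s (50-year MRI estimate)")
```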
15. The shaping of a national ignition campaign pulsed waveform
- Author
-
Brunton, Gordon, Erbert, Gaylen, Browning, Don, and Tse, Eddy
- Subjects
- *
NEUTRAL beams , *ULTRAVIOLET lasers , *INERTIAL confinement fusion , *PULSE shaping (Digital communications) , *COMPUTER algorithms - Abstract
Abstract: The National Ignition Facility (NIF) at Lawrence Livermore National Laboratory is a stadium-sized facility containing a 192 beam, 1.8MJ, 500TW ultraviolet laser system used for inertial confinement fusion research. For each experimental shot, NIF must deliver a precise amount of laser power on the target for successful and efficient target ignition, and these characteristics vary depending on the physics of the particular campaign. The precise temporal shape, energy and timing characteristics of a pulsed waveform target interaction are key components in meeting the experimental goals. Each NIF pulse is generated in the Master Oscillator Room (MOR) using an electro-optic modulator to vary the intensity of light in response to an electrical input. The electrical drive signal to the modulator is produced using a unique, high-performance arbitrary waveform generator (AWG). This AWG sums the output of 140 electrical impulse generators, each producing a 300ps pulse width Gaussian signal separated in time by 250ps. By adjusting the amplitudes and summing the 140 impulses, a pulsed waveform can be sculpted from a seed 45ns square pulse. Using software algorithms written for NIF's Integrated Computer Control System (ICCS), the system is capable of autonomously shaping 48 unique experimental pulsed waveforms for each shot that have demonstrated up to 275:1 contrast ratio with ±3% absolute error averaged over any 2ns interval, meeting the stringent pulse requirements needed to achieve ignition. In this paper, we provide an overview of the pulse shaping system, software algorithms and associated challenges that have been overcome throughout the evolution of the controls. [Copyright © Elsevier]
- Published
- 2012
- Full Text
- View/download PDF
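Entry 15 describes an arbitrary waveform generator that sums 140 Gaussian impulses (about 300 ps wide, spaced 250 ps apart) to sculpt a pulse from a 45 ns seed window. The least-squares amplitude fit below is a simplified numerical illustration of that idea, not the NIF ICCS shaping algorithm; the target shape and the interpretation of the 300 ps width as a FWHM are assumptions.

```python
import numpy as np

# Time axis covering the 45 ns seed window, and 140 impulse centers 250 ps apart.
t = np.linspace(0, 45e-9, 4500)                       # 10 ps resolution
centers = np.arange(140) * 250e-12 + 2e-9             # impulses spaced by 250 ps
sigma = 300e-12 / (2 * np.sqrt(2 * np.log(2)))        # treat 300 ps as FWHM

# Basis matrix: one Gaussian impulse response per column.
basis = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / sigma) ** 2)

# Hypothetical target: a low "foot" followed by a sharp peak near the end,
# loosely shaped like an ignition-style ramp (illustrative only).
target = 0.05 + 0.95 * np.exp(-0.5 * ((t - 36e-9) / 1.5e-9) ** 2)
target[t < 1e-9] = 0.0

# Least-squares choice of the 140 impulse amplitudes, clipped to be non-negative.
amps, *_ = np.linalg.lstsq(basis, target, rcond=None)
amps = np.clip(amps, 0.0, None)
approx = basis @ amps
print("max abs shaping error:", float(np.max(np.abs(approx - target))))
```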
16. A COMPUTATIONAL APPROACH TO SITUATIONAL AWARENESS.
- Author
-
Sherwin, Jason S.
- Subjects
- *
COMPUTATIONAL complexity , *SITUATIONAL awareness , *COMPUTER algorithms , *INFORMATION storage & retrieval systems , *ELECTRONIC records - Abstract
This paper proposes a method for accomplishing computational situational awareness (SA). The specific case of the Iraq conflict after Saddam Hussein's deposal is used as an example to demonstrate the computational version of a policy-maker's SA in such a case. This computational SA is then compared to the reports of the United States Department of Defense in which a qualitative account of the actual policy-makers' SA is given. From this comparison, it is seen that the implementation of the computing algorithms used here delivered valid results. Consequently, this study opens a new avenue of research in which computer-based calculation can aid policy-makers in making decisions on complex matters of international policy. [ABSTRACT FROM AUTHOR]
- Published
- 2010
17. SPIRS: A Web-based image retrieval system for large biomedical databases
- Author
-
Hsu, William, Antani, Sameer, Long, L. Rodney, Neve, Leif, and Thoma, George R.
- Subjects
- *
IMAGE retrieval , *COMPUTERS in medicine , *MEDICAL imaging systems , *WEBSITES , *IMAGE storage & retrieval systems , *SPINE , *COMPUTER algorithms - Abstract
Abstract: Purpose: With the increasing use of images in disease research, education, and clinical medicine, the need for methods that effectively archive, query, and retrieve these images by their content is underscored. This paper describes the implementation of a Web-based retrieval system called SPIRS (Spine Pathology & Image Retrieval System), which permits exploration of a large biomedical database of digitized spine X-ray images and data from a national health survey using a combination of visual and textual queries. Methods: SPIRS is a generalizable framework that consists of four components: a client applet, a gateway, an indexing and retrieval system, and a database of images and associated text data. The prototype system is demonstrated using text and imaging data collected as part of the second U.S. National Health and Nutrition Examination Survey (NHANES II). Users search the image data by providing a sketch of the vertebral outline or selecting an example vertebral image and some relevant text parameters. Pertinent pathology on the image/sketch can be annotated and weighted to indicate importance. Results: During the course of development, we explored different algorithms to perform functions such as segmentation, indexing, and retrieval. Each algorithm was tested individually and then implemented as part of SPIRS. To evaluate the overall system, we first tested the system's ability to return similar vertebral shapes from the database given a query shape. Initial evaluations using visual queries only (no text) have shown that the system achieves up to 68% accuracy in finding images in the database that exhibit similar abnormality type and severity. Relevance feedback mechanisms have been shown to increase accuracy by an additional 22% after three iterations. While we primarily demonstrate this system in the context of retrieving vertebral shape, our framework has also been adapted to search a collection of 100,000 uterine cervix images to study the progression of cervical cancer. Conclusions: SPIRS is automated, easily accessible, and integratable with other complementary information retrieval systems. The system supports the ability for users to intuitively query large amounts of imaging data by providing visual examples and text keywords and has beneficial implications in the areas of research, education, and patient care. [Copyright © Elsevier]
- Published
- 2009
- Full Text
- View/download PDF
18. Thirty Five Years of Computer Cartograms.
- Author
-
Tobler, Waldo
- Subjects
CARTOGRAPHY ,MERCATOR projection (Cartography) ,COMPUTER algorithms ,NOMOGRAPHY (Mathematics) ,NUMERICAL analysis - Abstract
The notion of a cartogram is reviewed. Then, based on a presentation from the 1960s, a direct and simple introduction is given to the design of a computer algorithm for the construction of contiguous value-by-area cartograms. As an example, a table of latitude/longitude to rectangular plane coordinates is included for a cartogram of the United States, along with Tissot's measures for this map projection. This is followed by a short review of the subsequent history of the subject and includes citation of algorithms proposed by others. In contrast to the usual geographic map, the most common use of cartograms is solely for the display and emphasis of a geographic distribution. A second use is in analysis, as a nomograph or problem-solving device similar in use to Mercator's projection, or in the transform-solve-invert paradigm. Recent innovations by computer scientists modify the objective and suggest variation similar to Airy's (1861) “balance of errors” idea for map projections. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
19. Unregulated Algorithmic Trading: Testing the Boundaries of the European Union Algorithmic Trading Regime.
- Author
-
Pereira, Clara Martins
- Subjects
COMPUTER algorithms ,NETWORK routers ,JURISDICTION - Abstract
Trading in modern equity markets has come to be dominated by machines and algorithms. However, there is significant concern over the impact of algorithmic trading on market quality and a number of jurisdictions have moved to address the risks associated with this new type of trading. The European Union has been no exception to this trend. This article argues that while the European Union algorithmic trading regime is often perceived as a tough response to the challenges inherent in machine trading, it has one crucial shortcoming: it does not regulate the simpler, basic execution algorithms used in automated order routers. Yet the same risk generally associated with algorithmic trading activity also arises, in particular, from the use of these basic execution algorithms—as was made evident by the trading glitch that led to the fall of United States securities trader Knight Capital in 2012. Indeed, such risk could even be amplified by the lack of sophistication of these simpler execution algorithms. It is thus proposed that the European Union should amend the objective scope of its algorithmic trading regime by expanding the definition of algorithmic trading under the Markets in Financial Instruments Directive (MiFID II) to include all execution algorithms, regardless of their complexity. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
20. Modern Heuristics and Hybrid Algorithms for Engineering Problems Resolution: Preface.
- Author
-
Begambre, Oscar
- Subjects
HEURISTIC ,COMPUTER algorithms ,MATHEMATICAL transformations ,MATHEMATICAL optimization - Abstract
The article reports on the significance of modern heuristics and hybrid algorithms for solving engineering problems in the U.S. The author mentions that heuristic optimization methods based on imitating natural, biological, social or cultural processes have been used by the scientific community because of their ability to explore multimodal and high-dimensional solution spaces. He adds that some heuristic algorithms suffer from reduced confidence, poor precision and low stability.
- Published
- 2010
- Full Text
- View/download PDF
21. It’s Not the Algorithm, It’s the Data.
- Author
-
Kirkpatrick, Keith
- Subjects
RISK assessment ,PREDICTIVE policing ,RACE discrimination in criminal justice administration ,COMPUTER algorithms ,CRIME statistics - Abstract
The article discusses the impact of biased data on risk assessment and predictive policing. Topics include the use of risk-based assessment tools such as COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) to aid U.S. states in sentencing criminals, concerns regarding the possibility of computerized risk-assessment algorithms penalizing racial minorities by overpredicting their likelihood of recidivism, and the use of poverty, postal codes, and employment status by COMPAS.
- Published
- 2017
- Full Text
- View/download PDF
22. Deciphering Crypto-Discourse: Articulations of Internet Freedom in Relation to The State.
- Subjects
CRYPTOGRAPHY ,INTERNET ,DATA encryption software ,COMPUTER algorithms - Abstract
The article focuses on deciphering cryptography and articulations of internet freedom in relation to the state. Topics include cryptography, which refers to encryption software that renders online communication illegible to anyone but its intended recipient, and the U.S. government's classification of encryption as war materiel until the 1990s, which made encryption algorithms illegal to export.
- Published
- 2017
23. Algorithms, bots, and political communication in the US 2016 election: The challenge of automated political communication for election law and administration.
- Author
-
Howard, Philip N., Woolley, Samuel, and Calo, Ryan
- Subjects
POLITICAL communication ,ELECTION law ,UNITED States presidential election, 2016 ,SOCIAL media & politics ,COMPUTER algorithms - Abstract
Political communication is the process of putting information, technology, and media in the service of power. Increasingly, political actors are automating such processes, through algorithms that obscure motives and authors yet reach immense networks of people through personal ties among friends and family. Not all political algorithms are used for manipulation and social control however. So what are the primary ways in which algorithmic political communication—organized by automated scripts on social media—may undermine elections in democracies? In the US context, what specific elements of communication policy or election law might regulate the behavior of such “bots,” or the political actors who employ them? First, we describe computational propaganda and define political bots as automated scripts designed to manipulate public opinion. Second, we illustrate how political bots have been used to manipulate public opinion and explain how algorithms are an important new domain of analysis for scholars of political communication. Finally, we demonstrate how political bots are likely to interfere with political communication in the United States by allowing surreptitious campaign coordination, illegally soliciting either contributions or votes, or violating rules on disclosure. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
24. ACM FORUM.
- Author
-
Knowles, Brad, Schlafly, Roger, Schultz, Grant D., and Zelvin, Lynn
- Subjects
LETTERS to the editor ,COMPUTER security ,COMPUTER algorithms ,DATA encryption ,PUBLIC key cryptography - Abstract
Presents several letters to the editor about cryptography, published in the November 1, 1992 issue of the journal "Communications of the ACM." Drawbacks of the U.S. National Security Agency's computer algorithms related to data encryption; reasons why the RSA public key cryptography algorithm from RSA Data Security Inc. is not patentable outside the U.S.; factors contributing to cryptographic abuse.
- Published
- 1992
25. Unsupervised regionalization of the United States into landscape pattern types.
- Author
-
Niesterowicz, J., Stepinski, T.F., and Jasiewicz, J.
- Subjects
LANDSCAPES ,LAND cover ,COMPUTER vision ,GEODATABASES ,COMPUTER algorithms - Abstract
We present a pattern-based regionalization of the conterminous US – a partitioning of the country into a number of mutually exclusive and exhaustive regions that maximizes the intra-region stationarity of land cover patterns and inter-region disparity between those patterns. The result is a discretization of the land surface into a number of landscape pattern types (LPTs) – spatial units each containing a unique quasi-stationary pattern of land cover classes. To achieve this goal, we use a recently developed method which utilizes machine vision techniques. First, the entire National Land Cover Dataset (NLCD) is partitioned into a grid of square-size blocks of cells, called motifels. The size of a motifel defines the spatial scale of a local landscape. The land cover classes of cells within a motifel form a local landscape pattern which is mathematically represented by a histogram of co-occurrence features. Using the Jensen–Shannon divergence as a dissimilarity function between patterns we group the motifels into several LPTs. The grouping procedure consists of two phases. First, the grid of motifels is partitioned spatially using a region-growing segmentation algorithm. Then, the resulting segments of this grid, each represented by its medoid, are clustered using a hierarchical algorithm with Ward’s linkage. The broad-extent maps of progressively more generalized LPTs resulting from this procedure are shown and discussed. Our delineated LPTs agree well with the perceptual patterns seen in the NLCD map. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
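Entry 25 represents each local landscape (motifel) by a histogram of land-cover class co-occurrences and compares patterns with the Jensen–Shannon divergence. The sketch below builds a simple horizontal/vertical co-occurrence histogram for a small label grid and computes the JS divergence; the motifel size, neighborhood definition, and normalization details are assumptions rather than the paper's exact implementation.

```python
import numpy as np

def cooccurrence_histogram(labels, n_classes):
    """Histogram of (class, class) pairs over horizontal and vertical neighbors
    inside one motifel (a square block of land-cover cells)."""
    h = np.zeros((n_classes, n_classes), dtype=float)
    a, b = labels[:, :-1].ravel(), labels[:, 1:].ravel()   # horizontal pairs
    c, d = labels[:-1, :].ravel(), labels[1:, :].ravel()   # vertical pairs
    for x, y in zip(np.concatenate([a, c]), np.concatenate([b, d])):
        h[x, y] += 1
        h[y, x] += 1                                        # orderless pairs
    h = h.ravel()
    return h / h.sum()

def jensen_shannon(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two normalized histograms (base e)."""
    p, q = np.asarray(p) + eps, np.asarray(q) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

rng = np.random.default_rng(3)
motifel_a = rng.integers(0, 3, size=(16, 16))   # mixed land-cover pattern
motifel_b = np.zeros((16, 16), dtype=int)       # homogeneous pattern
print(jensen_shannon(cooccurrence_histogram(motifel_a, 3),
                     cooccurrence_histogram(motifel_b, 3)))
```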
26. INTERVAL METHODS IN KNOWLEDGE REPRESENTATION.
- Author
-
KREINOVICH, VLADIK
- Subjects
- *
COMPUTER science research , *COMPUTER algorithms , *INFORMATION resources ,ABSTRACTS - Abstract
This section is maintained by Vladik Kreinovich. Please send your abstracts (or copies of papers that you want to see reviewed here) to vladik@cs.utep.edu, or by regular mail to: Vladik Kreinovich, Department of Computer Science, University of Texas at El Paso, El Paso, TX 79968, USA. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
27. A Kernel Testbed for Parallel Architecture, Language, and Performance Research.
- Author
-
Strohmaier, Erich, Williams, Samuel, Kaiser, Alex, Madduri, Kamesh, Ibrahim, Khaled, Bailey, David, and Demmel, James W.
- Subjects
COMPUTER software ,BINARY number system ,COMPUTER algorithms ,COMPILERS (Computer programs) - Abstract
The article presents research that investigates computer software and hardware systems to improve the performance of fixed binaries locking and arbitrary code-sequences in the U.S. Researchers studied compilers and developed new programming models for use in memory architecture, fixed processors and computational algorithms. They also examined how these kernels behave in a cache memory hierarchy, and how this hierarchy needs to change as systems move to highly multicore and exascale designs.
- Published
- 2010
- Full Text
- View/download PDF
28. PATENT LAW'S FUNCTIONALITY MALFUNCTION AND THE PROBLEM OF OVERBROAD, FUNCTIONAL SOFTWARE PATENTS.
- Author
-
COLLINS, KEVIN EMERSON
- Subjects
PATENT law ,FUNCTIONALITY doctrine (Trademarks) ,COMPUTER algorithms ,PATENTABILITY -- Lawsuits & claims ,GOVERNMENT policy ,HISTORY - Abstract
Contemporary software patents are problematic because they are often overbroad. This Article offers a novel explanation of the root cause of this overbreadth. Patent law suffers from a functionality malfunction: the conventional scope-curtailing doctrines of patent law break down and lose their ability to rein in overbroad claims whenever they are brought to bear on technologies, like software, in which inventions are purely functional entities. In addition to identifying the functionality malfunction in the software arts, this Article evaluates the merits of the most promising way of fixing it. Courts can identify algorithms as the metaphorical structure of software inventions and limit claim scope to particular algorithms for achieving a claimed function. However, framing algorithms as the metaphorical structure of software inventions cannot put the scope of software patents on par with the scope of patents in other arts. Most importantly, the recursive nature of algorithms and Gottschalk v. Benson create to-date unappreciated problems. [ABSTRACT FROM AUTHOR]
- Published
- 2013
29. Changes in Federal Information Processing Standard (FIPS) 180-4, Secure Hash Standard.
- Author
-
Dang, Quynh
- Subjects
CRYPTOGRAPHY ,HASHING ,COMPUTER algorithms ,CIPHERS ,GOVERNMENT standards - Abstract
This article describes the changes between Federal Information Processing Standards FIPS 180-3 and FIPS 180-4. FIPS 180-4 specifies two new secure cryptographic hash algorithms: secure hashing algorithms SHA-512/224 and SHA-512/256; it also includes a method for determining initial value(s) for any future SHA-512-based hash algorithm(s). FIPS 180-4 also removes a requirement for the execution of the message length encoding operation. [ABSTRACT FROM PUBLISHER]
- Published
- 2013
- Full Text
- View/download PDF
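Entry 29 concerns FIPS 180-4, which added SHA-512/224 and SHA-512/256. In Python these can usually be reached through hashlib when the underlying OpenSSL build provides them (the names below are OpenSSL's, so availability is an assumption about the local build); note that SHA-512/256 uses its own initial hash values, so it is not simply a truncation of a SHA-512 digest.

```python
import hashlib

msg = b"abc"

# SHA-512-based truncated variants added in FIPS 180-4. These go through
# OpenSSL, so they may be absent on some builds; guard accordingly.
for name in ("sha512_224", "sha512_256"):
    if name in hashlib.algorithms_available:
        digest = hashlib.new(name, msg).hexdigest()
        print(f"{name}: {digest}")
    else:
        print(f"{name}: not provided by this OpenSSL build")

# Plain SHA-512 for comparison; its first 32 bytes do NOT equal SHA-512/256
# because FIPS 180-4 assigns the truncated variants different initial values.
print("sha512  :", hashlib.sha512(msg).hexdigest()[:64], "(first 256 bits)")
```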
30. Taming the Mobile Data Deluge With Drop Zones.
- Author
-
Trestian, Ionut, Ranjan, Supranamaya, Kuzmanovic, Aleksandar, and Nucci, Antonio
- Subjects
SMARTPHONES ,MOBILE apps ,QUALITY of service ,COMPUTER networks ,COMPUTER architecture ,COMPUTER users ,COMPUTER algorithms - Abstract
Human communication has been changed by the advent of smartphones. Using commonplace mobile device features, users have begun uploading large and ever-increasing amounts of content. This increase in demand will overwhelm capacity and limit the providers' ability to provide the quality of service demanded by their users. In the absence of technical solutions, cellular network providers are considering changing billing plans to address this. Our contributions are twofold. First, by analyzing user content upload behavior, we find that the user-generated content problem is a user behavioral problem. In particular, by analyzing user mobility and data logs of 2 million users of one of the largest US cellular providers, we find that: 1) users upload content from a small number of locations; 2) because such locations differ across users, the problem appears ubiquitous. However, we also find that: 3) there exists a significant lag between content generation and uploading times, and 4) it is consistently the same users who delay their uploads. Second, we propose a cellular network architecture. Our approach proposes capacity upgrades at a select number of locations called Drop Zones. Although not particularly popular for uploads originally, Drop Zones fall seamlessly within the natural movement patterns of a large number of users. They are therefore suited for uploading larger quantities of content in a postponed manner. We design infrastructure placement algorithms and demonstrate that by upgrading infrastructure at only 963 base stations across the entire US, it is possible to deliver 50% of content via Drop Zones. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
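Entry 30 mentions infrastructure placement algorithms that pick a small set of base stations (Drop Zones) through which a target fraction of delayed uploads could be delivered. The abstract does not describe the actual algorithm, so the greedy coverage heuristic below is purely a hypothetical illustration; the data layout (per-user visit sets and upload volumes) is an assumption.

```python
def greedy_drop_zones(user_visits, upload_volume, target_fraction=0.5):
    """Greedily choose base stations until they 'cover' the target share of
    upload volume, where a user's volume counts as covered once any station
    on that user's movement path has been upgraded.

    user_visits: dict user -> set of base-station ids the user passes through
    upload_volume: dict user -> bytes the user uploads
    """
    total = sum(upload_volume.values())
    covered_users, chosen = set(), []
    while sum(upload_volume[u] for u in covered_users) < target_fraction * total:
        # Marginal gain of each candidate station over currently covered users.
        gains = {}
        for user, stations in user_visits.items():
            if user in covered_users:
                continue
            for s in stations:
                gains[s] = gains.get(s, 0) + upload_volume[user]
        if not gains:
            break                                    # nothing left to cover
        best = max(gains, key=gains.get)
        chosen.append(best)
        covered_users |= {u for u, st in user_visits.items() if best in st}
    return chosen

visits = {"u1": {"A", "B"}, "u2": {"B"}, "u3": {"C"}, "u4": {"B", "C"}}
volume = {"u1": 40, "u2": 25, "u3": 20, "u4": 15}
print(greedy_drop_zones(visits, volume))   # ['B'] already covers 80% of volume
```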
31. Rumors in a Network: Who's the Culprit?
- Author
-
Shah, Devavrat and Zaman, Tauhid
- Subjects
COMPUTER networks ,MAXIMUM likelihood statistics ,PROBABILITY theory ,COMPUTER simulation ,GRAPH theory ,COMPUTER algorithms ,ELECTRIC power transmission - Abstract
We provide a systematic study of the problem of finding the source of a rumor in a network. We model rumor spreading in a network with the popular susceptible-infected (SI) model and then construct an estimator for the rumor source. This estimator is based upon a novel topological quantity which we term rumor centrality. We establish that this is a maximum likelihood (ML) estimator for a class of graphs. We find the following surprising threshold phenomenon: on trees which grow faster than a line, the estimator always has nontrivial detection probability, whereas on trees that grow like a line, the detection probability will go to 0 as the network grows. Simulations performed on synthetic networks such as the popular small-world and scale-free networks, and on real networks such as an internet AS network and the U.S. electric power grid network, show that the estimator either finds the source exactly or within a few hops of the true source across different network topologies. We compare rumor centrality to another common network centrality notion known as distance centrality. We prove that on trees, the rumor center and distance center are equivalent, but on general networks, they may differ. Indeed, simulations show that rumor centrality outperforms distance centrality in finding rumor sources in networks which are not tree-like. [ABSTRACT FROM PUBLISHER]
- Published
- 2011
- Full Text
- View/download PDF
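Entry 31 introduces rumor centrality and states that on trees it yields a maximum likelihood source estimate. A common way to write rumor centrality for a tree with n infected nodes is R(v) = n! / Π_u T_u^v, where T_u^v is the size of the subtree rooted at u when the tree is rooted at v; the sketch below uses that formula as an assumption (the abstract itself does not spell it out) and picks the node that maximizes it.

```python
import math
from collections import defaultdict

def rumor_centrality(edges):
    """Rumor centrality R(v) = n! / prod_u T_u^v for every node of a tree,
    where T_u^v is the subtree size of u when the tree is rooted at v."""
    adj = defaultdict(list)
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    nodes = list(adj)
    n = len(nodes)

    def subtree_product(root):
        prod, stack, size = 1, [(root, None, False)], {}
        while stack:
            node, parent, processed = stack.pop()
            if processed:
                size[node] = 1 + sum(size[c] for c in adj[node] if c != parent)
                prod *= size[node]
            else:
                stack.append((node, parent, True))
                for c in adj[node]:
                    if c != parent:
                        stack.append((c, node, False))
        return prod

    return {v: math.factorial(n) // subtree_product(v) for v in nodes}

# Small infected subtree: node 2 sits in the "middle" and gets the highest score.
edges = [(1, 2), (2, 3), (2, 4), (4, 5)]
scores = rumor_centrality(edges)
print(scores, "estimated source:", max(scores, key=scores.get))
```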
32. NIST Block Cipher Modes of Operation for Confidentiality.
- Author
-
Stallings, William
- Subjects
CIPHERS ,CONFIDENTIAL communications ,DATA encryption ,COMPUTER algorithms ,CRYPTOGRAPHY - Abstract
In this article, we describe the five block cipher modes of operation that have been approved by the National Institute of Standards and Technology (NIST) for confidentiality. Each mode specifies an algorithm for encrypting/decrypting data sequences that are longer than a single block. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
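Entry 32 surveys the NIST-approved confidentiality modes of operation. As a concrete illustration, the snippet below encrypts and decrypts with AES in CTR mode using the third-party `cryptography` package; the key and nonce handling is deliberately minimal and is an illustrative assumption, not guidance from the article.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)        # AES-256 key
nonce = os.urandom(16)      # CTR counter block (must be unique per key)
plaintext = b"Block cipher modes turn a block cipher into a stream of data."

# CTR is one of the five NIST confidentiality modes; it needs no padding
# because the cipher output is used as a keystream.
encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
recovered = decryptor.update(ciphertext) + decryptor.finalize()
assert recovered == plaintext
print(ciphertext.hex())
```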
33. Emergency Response Workflow Resource Requirements Modeling and Analysis.
- Author
-
Jiacun Wang, Tepfenhart, William, and Rosca, Daniela
- Subjects
WORK measurement ,EMPLOYEES' workload ,STRATEGIC planning ,EMERGENCY management ,INDUSTRIAL management ,BUSINESS requirements analysis ,COMPUTER algorithms ,BUSINESS models - Abstract
The article focuses on the creation of the Workflows Intuitive Formal Approach (WIFA) to address workflow management requirements in emergency planning and response systems in the U.S. It presents an efficient resource requirement analysis algorithm formulated to help individuals decide on the minimum resource set. In addition, it highlights the results of a work survey, an extended version of WIFA, and an example that is used to illustrate the use of the resource-constrained workflow model. The authors conclude that the approach is applicable to emergency planning and response system workflow specification.
- Published
- 2009
- Full Text
- View/download PDF
34. A new method for class prediction based on signed-rank algorithms applied to Affymetrix microarray experiments.
- Author
-
Rème, Thierry, Hose, Dirk, De Vos, John, Vassal, Aurélien, Poulain, Pierre-Olivier, Pantesco, Véronique, Goldschmidt, Hartmut, and Klein, Bernard
- Subjects
DNA microarrays ,COMPUTER algorithms ,NUCLEIC acid hybridization ,MULTIPLE myeloma ,Y chromosome ,IMMUNOGLOBULINS - Abstract
Background: The huge amount of data generated by DNA chips is a powerful basis to classify various pathologies. However, constant evolution of microarray technology makes it difficult to mix data from different chip types for class prediction of limited sample populations. Affymetrix® technology provides both a quantitative fluorescence signal and a decision (detection call: absent or present) based on signed-rank algorithms applied to several hybridization repeats of each gene, with a per-chip normalization. We developed a new prediction method for class belonging based on the detection call only from recent Affymetrix chip types. Biological data were obtained by hybridization on U133A, U133B and U133Plus 2.0 microarrays of purified normal B cells and cells from three independent groups of multiple myeloma (MM) patients. Results: After a call-based data reduction step to filter out non class-discriminative probe sets, the gene list obtained was reduced to a predictor with correction for multiple testing by iterative deletion of probe sets that sequentially improve inter-class comparisons and their significance. The error rate of the method was determined using leave-one-out and 5-fold cross-validation. It was successfully applied to (i) determine a sex predictor with the normal donor group, classifying gender with no error in all patient groups except for male MM samples with a Y chromosome deletion, (ii) predict the immunoglobulin light and heavy chains expressed by the malignant myeloma clones of the validation group and (iii) predict sex, light and heavy chain nature for every new patient. Finally, this method was shown to be powerful when compared to the popular classification method Prediction Analysis of Microarray (PAM). Conclusion: This normalization-free method is routinely used for quality control and correction of collection errors in patient reports to clinicians. It can be easily extended to multiple class prediction suitable for clinical groups, and looks particularly promising for international cooperative projects like the Microarray Quality Control (MAQC) project of the US FDA, as a predictive classifier for diagnosis, prognosis and response to treatment. Finally, it can be used as a powerful tool to mine published data generated on Affymetrix systems and, more generally, to classify samples with binary feature values. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
35. Pores and Ridges: High-Resolution Fingerprint Matching Using Level 3 Features.
- Author
-
Jain, Anil K., Yi Chen, and Demirkus, Meltem
- Subjects
HUMAN fingerprints ,SCANNING systems ,DETECTORS ,NATIONAL security ,ANTHROPOMETRY ,COMPUTER algorithms ,COMPUTER graphics - Abstract
Fingerprint friction ridge details are generally described in a hierarchical order at three different levels, namely, Level 1 (pattern), Level 2 (minutia points), and Level 3 (pores and ridge contours). Although latent print examiners frequently take advantage of Level 3 features to assist in identification, Automated Fingerprint Identification Systems (AFIS) currently rely only on Level 1 and Level 2 features. In fact, the Federal Bureau of Investigation's (FBI) standard of fingerprint resolution for AFIS is 500 pixels per inch (ppi), which is inadequate for capturing Level 3 features, such as pores. With the advances in fingerprint sensing technology, many sensors are now equipped with dual resolution (500 ppi/1,000 ppi) scanning capability. However, increasing the scan resolution alone does not necessarily provide any performance improvement in fingerprint matching, unless an extended feature set is utilized. As a result, a systematic study to determine how much performance gain one can achieve by introducing Level 3 features in AFIS is highly desired. We propose a hierarchical matching system that utilizes features at all the three levels extracted from 1,000 ppi fingerprint scans. Level 3 features, including pores and ridge contours, are automatically extracted using Gabor filters and wavelet transform and are locally matched using the Iterative Closest Point (ICP) algorithm. Our experiments show that Level 3 features carry significant discriminatory information. There is a relative reduction of 20 percent in the equal error rate (EER) of the matching system when Level 3 features are employed in combination with Level 1 and 2 features. This significant performance gain is consistently observed across various quality fingerprint images. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
36. Convergent Validity of the ASAM Patient Placement Criteria Using a Standardized Computer Algorithm.
- Author
-
Staines, Graham, Kosanke, Nicole, Magura, Stephen, Bali, Priti, Foote, Jeffrey, and Deluca, Alexander
- Subjects
MEDICAL protocols ,CONTINUUM of care ,COMPUTER algorithms ,ALCOHOLISM treatment - Abstract
The study examined the convergent validity of the ASAM Patient Placement Criteria (PPC) by comparing Level of Care (LOC) recommendations produced by two alternative methods: a computer-driven algorithm and a "standard" clinical assessment. A cohort of 248 applicants for alcoholism treatment were evaluated at a multi-modality treatment center. The two methods disagreed (58% of cases) more often than they agreed (42%). The algorithm recommended a more intense LOC than the clinician protocol in 81% of the discrepant cases. Four categories of disagreement accounted for 97% of the discrepant cases. Several major sources of disagreement were identified and examined in detail: clinicians' reasoned departures from the PPC rules, conservatism in algorithm LOC recommendations, and measurement overlap between two specific dimensions. In order for the ASAM PPC and its associated algorithm to be embraced by treatment programs, the observed differences in LOC recommendations between the algorithm and "standard" clinical assessment should be resolved. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
37. Radical Pruning: A Method to Construct Skeleton Radial Basis Function Networks.
- Author
-
Augusteijn, Marijke F. and Shaw, Kelly A.
- Subjects
ARTIFICIAL neural networks ,COMPUTER algorithms ,MACHINE learning - Abstract
Trained radial basis function networks are well-suited for use in extracting rules and explanations because they contain a set of locally tuned units. However, for rule extraction to be useful, these networks must first be pruned to eliminate unnecessary weights. The pruning algorithm cannot search the network exhaustively because of the computational effort involved. It is shown that using multiple pruning methods with smart ordering of the pruning candidates, the number of weights in a radial basis function network can be reduced to a small fraction of the original number. The complexity of the pruning algorithm is quadratic (instead of exponential) in the number of network weights. Pruning performance is shown using a variety of benchmark problems from the University of California, Irvine machine learning database. [ABSTRACT FROM AUTHOR]
- Published
- 2000
- Full Text
- View/download PDF
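Entry 37 concerns pruning weights from a trained radial basis function network. The sketch below trains a tiny RBF network by least squares and then removes output weights in order of increasing magnitude as long as the fit error stays near the original; this simple magnitude-style ordering is an illustrative stand-in, not the multi-method pruning procedure of the paper, and the toy data, centers, and 10% error budget are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression task.
X = np.linspace(-3, 3, 200)[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)

# RBF design matrix: Gaussian units on a fixed grid of centers.
centers = np.linspace(-3, 3, 25)
width = 0.5
Phi = np.exp(-((X - centers[None, :]) ** 2) / (2 * width ** 2))

w, *_ = np.linalg.lstsq(Phi, y, rcond=None)          # trained output weights

def mse(weights, active):
    """Error of the network when only the `active` RBF units are kept."""
    pred = Phi[:, active] @ weights[active]
    return float(np.mean((pred - y) ** 2))

# Prune units with the smallest |weight| first, keeping error within 10%.
active = np.ones(len(w), dtype=bool)
budget = 1.10 * mse(w, active)
for idx in np.argsort(np.abs(w)):
    trial = active.copy()
    trial[idx] = False
    if mse(w, trial) <= budget:
        active = trial

print(f"kept {active.sum()} of {len(w)} RBF units, mse={mse(w, active):.4f}")
```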
38. Degrees, Designed by the NUMBERS.
- Author
-
PARRY, MARC
- Subjects
DATA mining ,INTERNET in higher education ,COMPUTER algorithms ,TECHNOLOGY ,INTERNET in education ,EDUCATIONAL technology ,COMPUTER assisted instruction ,MANAGEMENT science ,UNIVERSITIES & colleges - Abstract
The article discusses the use of data mining and computer-based algorithms to increase student productivity and success in U.S. higher education. It addresses attempts to reduce student attrition and improve graduation time among college students, as well as electronic monitoring of student progress to ensure they remain on track. It comments on the use of electronic assistance related to monitoring study habits, personalize course work, and anticipate success in an academic discipline. It also reports on the potential use of data mining for social aspects at universities, opposition to the programs, and considers issues of privacy and anonymity. INSET: A Conversation With 2 Developers of Personalized-Learning….
- Published
- 2012
39. Andrew Viterbi: The Key To Communications 40 Years Early.
- Author
-
Kilbane, Doris
- Subjects
ELECTRICAL engineers ,ELECTRICAL engineering ,COMPUTER algorithms ,AUTOMATIC speech recognition - Abstract
The article presents a biography of Andrew Viterbi, an electrical engineer and businessman in the U.S. He was born in Bergamo, Italy, and studied at the Massachusetts Institute of Technology. He earned his doctor of philosophy degree in 1962 and began searching for a simple rule to explain the processing techniques of engineering and applied science. Eventually, he formulated the Viterbi algorithm, which is widely used in error-correcting codes in mobile phones and in speech recognition systems.
- Published
- 2006
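Entry 39 profiles Andrew Viterbi and mentions the Viterbi algorithm's use in error-correcting codes and speech recognition. The snippet below shows the algorithm in its textbook dynamic-programming form, decoding the most likely hidden-state sequence of a small HMM; the toy model (states, probabilities, observations) is made up for illustration.

```python
import numpy as np

def viterbi(obs, start_p, trans_p, emit_p):
    """Most likely hidden-state path for an HMM, via dynamic programming.

    obs: sequence of observation indices
    start_p[s], trans_p[s, s'], emit_p[s, o]: model probabilities
    Log-probabilities are used to avoid underflow on long sequences.
    """
    n_states, T = len(start_p), len(obs)
    logv = np.full((T, n_states), -np.inf)      # best log-prob ending in state s
    back = np.zeros((T, n_states), dtype=int)   # argmax predecessor
    logv[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for t in range(1, T):
        scores = logv[t - 1][:, None] + np.log(trans_p)      # (prev, cur)
        back[t] = np.argmax(scores, axis=0)
        logv[t] = scores[back[t], np.arange(n_states)] + np.log(emit_p[:, obs[t]])
    path = [int(np.argmax(logv[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy 2-state HMM (e.g. "clean" vs "noisy" channel states) with 2 output symbols.
start = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3], [0.4, 0.6]])
emit = np.array([[0.9, 0.1], [0.2, 0.8]])
print(viterbi([0, 0, 1, 1, 0], start, trans, emit))
```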
40. Pricey OMS Vendor Attracts Attention.
- Author
-
Chapman, Peter
- Subjects
ELECTRONIC trading of securities ,COMPUTER algorithms ,PROGRAM trading (Securities) ,FINANCIAL markets ,DATA processing in securities ,ELECTRONIC commerce - Abstract
Reports on the computer algorithm for electronic trading of securities developed by LatentZero. Price of the Minerva order management system in relation to other competitors; Total number of clients using the Minerva system; Use of compliance engines to analyze orders to make sure they do not breach client and regulatory requirements.
- Published
- 2004