34 results for "Jim Austin"
Search Results
2. A Digital Repository and Execution Platform for Interactive Scholarly Publications in Neuroscience.
- Author
- Victoria J. Hodge, Mark Jessop, Martyn Fletcher, Michael Weeks, Aaron Turner, Thomas W. Jackson, Colin Ingram, Leslie Smith, and Jim Austin
- Published
- 2016
- Full Text
- View/download PDF
3. Hadoop neural network for parallel and distributed feature selection.
- Author
- Victoria J. Hodge, Simon O'Keefe, and Jim Austin
- Published
- 2016
- Full Text
- View/download PDF
4. A binary neural shape matcher using Johnson Counters and chain codes.
- Author
- Victoria J. Hodge, Simon O'Keefe, and Jim Austin
- Published
- 2009
- Full Text
- View/download PDF
5. Neural network based pattern matching and spike detection tools and services - in the CARMEN neuroinformatics project.
- Author
- Martyn Fletcher, Bojian Liang, Leslie Smith, Alastair Knowles, Thomas W. Jackson, Mark Jessop, and Jim Austin
- Published
- 2008
- Full Text
- View/download PDF
6. A binary neural decision table classifier.
- Author
- Victoria J. Hodge, Simon O'Keefe, and Jim Austin
- Published
- 2006
- Full Text
- View/download PDF
7. A high performance k-NN approach using binary neural networks.
- Author
- Victoria J. Hodge, Ken Lees, and Jim Austin
- Published
- 2004
- Full Text
- View/download PDF
8. Hierarchical word clustering - automatic thesaurus generation.
- Author
- Victoria J. Hodge and Jim Austin
- Published
- 2002
- Full Text
- View/download PDF
9. A 65-nm CMOS Lossless Bio-Signal Compression Circuit With 250 FemtoJoule Performance Per Bit
- Author
- Christopher Crispin-Bailey, Chengliang Dai, and Jim Austin
- Subjects
- Very-large-scale integration, Lossless compression, Computer science, Biomedical Engineering, Signal compression, Signal Processing, Computer-Assisted, Integrated circuit, Data Compression, CMOS, Compression ratio, Electronic engineering, Humans, Electrical and Electronic Engineering, Algorithms, Data compression, Data transmission
- Abstract
A 65 nm CMOS integrated circuit implementation of a bio-physiological signal compression device is presented, reporting exceptionally low power and extremely low silicon area cost relative to the state of the art. A novel 'xor-log2-sub-band' data compression scheme is evaluated, achieving modest compression but at very low resource cost. With the intent to design the 'simplest useful compression algorithm', the outcome is demonstrated to be very favourable where power must be saved by trading off compression effort against data storage capacity or data transmission power, even where more complex algorithms can deliver higher compression ratios. A VLSI design and a fabricated integrated circuit implementation are presented, and estimated performance gains and efficiency measures for various biomedical use cases are given. Power costs as low as 1.2 pJ per sample-bit are suggested for a 10 kSa/s data rate under a power-gating scenario, dropping to 250 fJ/bit at continuous conversion data rates of 5 MSa/s. This is achieved with a diminutive circuit area of 155 µm². Both power and area appear to be state of the art in terms of compression versus resource cost, which yields benefits for system optimization.
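The abstract names the scheme but not its mechanics, so the following is a hypothetical sketch of one plausible reading of "xor-log2-sub-band" coding, not the paper's verified circuit logic: XOR each sample against its predecessor, then store a short sub-band selector plus only as many residual bits as that sub-band needs. The sub-band widths and the 2-bit selector are assumptions.

```python
# Hypothetical xor-then-log2-sub-band encoder (illustrative only).
SUB_BANDS = [4, 8, 12, 16]  # assumed payload widths for 16-bit samples

def encode(samples):
    """XOR-decorrelate successive samples, then pick the narrowest
    sub-band that can hold each residual."""
    out, prev = [], 0
    for s in samples:
        residual = s ^ prev  # small when consecutive samples are similar
        prev = s
        for selector, width in enumerate(SUB_BANDS):
            if residual < (1 << width):
                out.append((selector, residual, width))
                break
    return out

def compressed_bits(encoded):
    # 2 selector bits plus the chosen sub-band payload per sample
    return sum(2 + width for _, _, width in encoded)

data = [1000, 1003, 999, 1002, 1500, 1498]
print(compressed_bits(encode(data)), "bits vs", 16 * len(data), "raw")
```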
- Published
- 2019
10. A metric for pattern-matching applications to traffic management
- Author
- Victoria J. Hodge, Jim Austin, Mike Smith, Richard Mounce, Garry Hollier, and Thomas Jackson
- Subjects
- Engineering, Traffic congestion reconstruction with Kerner's three-phase theory, Similarity (geometry), Transportation, Floating car data, Traffic flow, Computer Science Applications, Euclidean distance, Automotive Engineering, Metric (mathematics), Data mining, Pattern matching, Artificial intelligence, Traffic generation model, Civil and Structural Engineering
- Abstract
This paper considers signal plan selection; the main topic is the design of a system for utilising pattern matching to assist the timely selection of sound signal control plan changes. In this system, historical traffic flow data is continually searched, seeking traffic flow patterns similar to today’s. If, in one of these previous similar situations, (a) the signal plan utilised was different to that being utilised today and (b) it appears that the performance achieved was better than the performance likely to be achieved today, then the system recommends an appropriate signal plan switch. The heart of the system is “similarity”. Two traffic flow patterns (two time series of traffic flows arising from two different days) are said to be “similar” if the distance between them is small; similarity thus depends on how the metric or distance between two time series of traffic flows is defined. A simple example is given which suggests that utilising the standard Euclidean distance between the two sequences comprising cumulatives of traffic flow may be better than utilising the standard Euclidean distance between the original two sequences of traffic flow data. The paper also gives measured on-street public transport benefits which have arisen from using a simple rule-based (traffic-responsive) signal plan selection system, compared with a time-tabled signal plan selection system.
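The metric the paper suggests is easy to state in code. A minimal sketch with illustrative flow counts (not the paper's data), comparing the standard Euclidean distance on raw flows with the same distance on cumulative flows:

```python
import numpy as np

def raw_distance(flow_a, flow_b):
    """Euclidean distance between two traffic-flow time series."""
    return np.linalg.norm(np.asarray(flow_a, float) - np.asarray(flow_b, float))

def cumulative_distance(flow_a, flow_b):
    """Euclidean distance between the cumulative flow profiles, the variant
    the paper suggests may work better for signal-plan selection."""
    return np.linalg.norm(np.cumsum(flow_a) - np.cumsum(flow_b))

today = [120, 130, 150, 170]   # vehicles per interval (illustrative)
past = [125, 128, 148, 175]
print(raw_distance(today, past), cumulative_distance(today, past))
```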
- Published
- 2013
- Full Text
- View/download PDF
11. A Machine-Learning Approach to Keypoint Detection and Landmarking on 3D Meshes
- Author
- Clement Creusot, Nick Pears, and Jim Austin
- Subjects
- Computer science, Feature vector, Pattern recognition, Machine learning, Linear discriminant analysis, Set (abstract data type), Artificial Intelligence, Face (geometry), Pattern recognition (psychology), Computer vision, Polygon mesh, Computer Vision and Pattern Recognition, Artificial intelligence, AdaBoost, Focus (optics), Software
- Abstract
We address the problem of automatically detecting a sparse set of 3D mesh vertices, likely to be good candidates for determining correspondences, even on soft organic objects. We focus on 3D face scans, on which single local shape descriptor responses are known to be weak, sparse or noisy. Our machine-learning approach consists of computing feature vectors containing D different local surface descriptors. These vectors are normalized with respect to the learned distribution of those descriptors for some given target shape (landmark) of interest. Then, an optimal function of this vector is extracted that best separates this particular target shape from its surrounding region within the set of training data. We investigate two alternatives for this optimal function: a linear method, namely Linear Discriminant Analysis, and a non-linear method, namely AdaBoost. We evaluate our approach by landmarking 3D face scans in the FRGC v2 and Bosphorus 3D face datasets. Our system achieves state-of-the-art performance while being highly generic.
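A minimal sketch of the linear option using scikit-learn's LinearDiscriminantAnalysis. The per-vertex descriptor vectors below are synthetic stand-ins, and the paper's descriptor-normalization step is omitted, so this only illustrates the separation idea, not the published pipeline:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
D = 5  # number of local surface descriptors per vertex (illustrative)

# Synthetic descriptor vectors: one class for the target landmark's
# neighbourhood, one for the surrounding region.
landmark = rng.normal(1.0, 0.3, size=(200, D))
background = rng.normal(0.0, 0.3, size=(2000, D))
X = np.vstack([landmark, background])
y = np.r_[np.ones(len(landmark)), np.zeros(len(background))]

lda = LinearDiscriminantAnalysis().fit(X, y)
scores = lda.decision_function(X)  # higher score => more landmark-like vertex
print("mean landmark score:", scores[y == 1].mean().round(2))
```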
- Published
- 2013
- Full Text
- View/download PDF
12. A NOTE ON TWO APPLICATIONS OF LOGICAL MATCHING STRATEGY
- Author
- Jim Austin, Elizabeth Sherly, and Sanil Shanker Kp
- Subjects
- Matching (statistics), Artificial Intelligence, Computer science, Center (algebra and category theory), Data mining, Pattern matching, Fuzzy logic
- Abstract
This paper proposes a Logical Matching Strategy for sequential pattern matching. We show two real-world applications of the method: (1) locating repeating sequential patterns and (2) alignment-free comparison of sequential patterns of finite length using fuzzy membership values that are generated automatically from the number of matches and mismatches. The results show the utility of the method by analyzing DNA sequences taken from the National Center for Biotechnology Information (NCBI) databank. The Logical Matching Strategy can possibly be applied to develop a method of research in sequential pattern matching.
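A toy reading of the membership idea, assuming the value is derived directly from match and mismatch counts over aligned positions; the paper's actual derivation may differ:

```python
def fuzzy_membership(seq_a, seq_b):
    """Hypothetical membership score for two finite sequences:
    matches / (matches + mismatches) over the positions they share."""
    shared = min(len(seq_a), len(seq_b))
    matches = sum(a == b for a, b in zip(seq_a, seq_b))
    return matches / shared

print(fuzzy_membership("ACGTACGT", "ACGTTCGA"))  # -> 0.75
```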
- Published
- 2011
- Full Text
- View/download PDF
13. Three-dimensional face recognition using combinations of surface feature map subspace components
- Author
- Thomas Heseltine, Jim Austin, and Nick Pears
- Subjects
- Pattern recognition, Linear discriminant analysis, Facial recognition system, Depth map, Feature (computer vision), Face (geometry), Test set, Signal Processing, Three-dimensional face recognition, Computer vision, Computer Vision and Pattern Recognition, Artificial intelligence, Subspace topology, Mathematics
- Abstract
In this paper, we show the effect of using a variety of facial surface feature maps within the Fishersurface technique, which uses linear discriminant analysis, and suggest a method of identifying and extracting useful qualities offered by each surface feature map. Combining these multi-feature subspace components into a unified surface subspace, we create a three-dimensional face recognition system producing significantly lower error rates than individual surface feature map systems tested on the same data. We evaluate systems by performing up to 1,079,715 verification operations on a large test set of 3D face models. Results are presented in the form of false acceptance and false rejection rates, generated by varying a decision threshold applied to a distance metric in surface space.
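The evaluation protocol (sweep a decision threshold over the distance in surface space, counting false acceptances and false rejections) can be sketched as follows; the distances are illustrative, not values from the 3D face test set:

```python
import numpy as np

def far_frr(genuine_dists, impostor_dists, threshold):
    """Accept a comparison when its surface-space distance falls below
    the threshold; report the two resulting error rates."""
    frr = np.mean(np.asarray(genuine_dists) >= threshold)  # genuine pairs rejected
    far = np.mean(np.asarray(impostor_dists) < threshold)  # impostor pairs accepted
    return far, frr

genuine = [0.2, 0.3, 0.5, 0.4]   # same-person distances (illustrative)
impostor = [0.9, 0.7, 0.6, 1.1]  # different-person distances (illustrative)
for t in (0.45, 0.55, 0.65):
    print(t, far_frr(genuine, impostor, t))
```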
- Published
- 2008
- Full Text
- View/download PDF
14. U.S. Life Scientists Report Rising Salaries and High Job Satisfaction
- Author
- Jim Austin
- Subjects
- Inflation, Labour economics, Multidisciplinary, General partnership, Economics, Job satisfaction, Salary, Life Scientists
- Abstract
The latest salary survey of AAAS life scientists, developed in partnership with Kelly Scientific Resources, finds that pay increases outpaced inflation, especially for postdocs. (Read more.)
- Published
- 2006
- Full Text
- View/download PDF
15. DAME: Searching Large Data Sets Within a Grid-Enabled Engineering Application
- Author
- Jim Austin, Martyn Fletcher, Bojian Liang, Robert I. Davis, Thomas Jackson, Mark Jessop, and Andy Pasley
- Subjects
- Computer science, SIGNAL (programming language), Search engine technology, Grid, Domain (software engineering), Grid computing, Systems engineering, Aircraft maintenance, The Internet, Data mining, Pattern matching, Electrical and Electronic Engineering
- Abstract
The use of search engines within the Internet is now ubiquitous. This work examines how Grid technology may affect the implementation of search engines by focusing on the Signal Data Explorer application developed within the Distributed Aircraft Maintenance Environment (DAME) project. This application utilizes advanced neural-network-based methods (Advanced Uncertain Reasoning Architecture (AURA) technology) to search for matching patterns in time-series vibration data originating from Rolls-Royce aeroengines (jet engines). The large volume of data associated with the problem required the development of a distributed search engine, where data is held at a number of geographically disparate locations. This work gives a brief overview of the DAME project, the pattern matching problem, and the architecture. It also describes the Signal Data Explorer application and provides an overview of the underlying search engine technology and its use in the aeroengine health-monitoring domain.
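The AURA correlation-matrix-memory search itself is not reproduced here; as a stand-in, a brute-force sliding-window scan shows the kind of time-series pattern match the distributed engine accelerates. The vibration data is synthetic:

```python
import numpy as np

def best_match(series, query):
    """Brute-force nearest-window search; a plain stand-in for the
    AURA-based matching used by the Signal Data Explorer."""
    series = np.asarray(series, float)
    query = np.asarray(query, float)
    windows = np.lib.stride_tricks.sliding_window_view(series, len(query))
    dists = np.linalg.norm(windows - query, axis=1)
    i = int(np.argmin(dists))
    return i, dists[i]

rng = np.random.default_rng(1)
vibration = np.sin(np.linspace(0, 20, 500)) + 0.05 * rng.normal(size=500)
print(best_match(vibration, vibration[200:220]))  # -> (200, 0.0)
```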
- Published
- 2005
- Full Text
- View/download PDF
16. A Survey of Outlier Detection Methodologies
- Author
- Victoria J. Hodge and Jim Austin
- Subjects
- Linguistics and Language, Instrument error, Computer science, Human error, Language and Linguistics, Data set, Gamut, Artificial Intelligence, Outlier, Anomaly detection, Noise (video), Data mining
- Abstract
Outlier detection has been used for centuries to detect and, where appropriate, remove anomalous observations from data. Outliers arise due to mechanical faults, changes in system behaviour, fraudulent behaviour, human error, instrument error or simply through natural deviations in populations. Their detection can identify system faults and fraud before they escalate with potentially catastrophic consequences. It can identify errors and remove their contaminating effect on the data set and, as such, purify the data for processing. The original outlier detection methods were arbitrary, but now principled and systematic techniques are used, drawn from the full gamut of computer science and statistics. In this paper, we introduce a survey of contemporary techniques for outlier detection. We identify their respective motivations and distinguish their advantages and disadvantages in a comparative review.
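As one concrete instance of the principled statistical techniques such surveys cover (a textbook method, not one claimed by this paper), a z-score rule flags points far from the sample mean:

```python
import numpy as np

def zscore_outliers(x, k=3.0):
    """Flag observations more than k standard deviations from the mean."""
    x = np.asarray(x, float)
    z = np.abs(x - x.mean()) / x.std()
    return np.flatnonzero(z > k)

rng = np.random.default_rng(0)
readings = np.append(rng.normal(10, 1, 100), 25.0)  # one injected fault
print(zscore_outliers(readings))  # -> [100]
```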
- Published
- 2004
- Full Text
- View/download PDF
17. Distribution forecasting of high frequency time series
- Author
- Jim Austin and Andy Pasley
- Subjects
- Information Systems and Management, Profit (accounting), Artificial neural network, Series (mathematics), Computer science, Covariance matrix, Nonparametric statistics, Management Information Systems, Data set, Distribution (mathematics), Arts and Humanities (miscellaneous), Developmental and Educational Psychology, Probability distribution, Data mining, Information Systems, Volume (compression)
- Abstract
The availability of high frequency data sets in finance has allowed the use of very data intensive techniques in forecasting. An algorithm requiring fast k-NN type search has been implemented using AURA, a binary neural network based upon Correlation Matrix Memories. This work has also constructed probability distribution forecasts, the volume of data allowing this to be done in a nonparametric manner. In addition to standard statistical error measures, the implementation of simulations has allowed actual measures of profit to be calculated.
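A minimal sketch of a nonparametric distribution forecast in this spirit: find the k historical windows nearest the current pattern and take the empirical distribution of what followed each. The brute-force neighbour search stands in for the AURA implementation, and the random-walk series is illustrative:

```python
import numpy as np

def knn_distribution_forecast(history, pattern, k=100):
    """Empirical predictive distribution from the k nearest historical windows."""
    m = len(pattern)
    windows = np.lib.stride_tricks.sliding_window_view(history[:-1], m)
    dists = np.linalg.norm(windows - pattern, axis=1)
    nearest = np.argsort(dists)[:k]
    return history[nearest + m]  # the value that followed each matched window

prices = np.cumsum(np.random.default_rng(2).normal(0, 1, 10_000))
outcomes = knn_distribution_forecast(prices, prices[-5:])
print(outcomes.mean(), np.percentile(outcomes, [5, 95]))
```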
- Published
- 2004
- Full Text
- View/download PDF
18. A comparison of standard spell checking algorithms and a novel binary neural approach
- Author
- Victoria J. Hodge and Jim Austin
- Subjects
- Matching (statistics), Artificial neural network, Computer science, Supervised learning, Hamming distance, Machine learning, Spelling, Computer Science Applications, Computational Theory and Mathematics, Benchmark (computing), Pattern matching, Artificial intelligence, Algorithm, Natural language processing, Word (computer architecture), Information Systems
- Abstract
In this paper, we propose a simple, flexible, and efficient hybrid spell checking methodology based upon phonetic matching, supervised learning, and associative matching in the AURA neural system. We integrate Hamming Distance and n-gram algorithms, which have high recall for typing errors, with a phonetic spell-checking algorithm in a single novel architecture. Our approach is suitable for any spell checking application, though it is aimed toward isolated word error correction, particularly spell checking user queries in a search engine. We use a novel scoring scheme to integrate the retrieved words from each spelling approach and calculate an overall score for each matched word. From the overall scores, we can rank the possible matches. We evaluate our approach against several benchmark spell-checking algorithms for recall accuracy. Our proposed hybrid methodology has the highest recall rate of the techniques evaluated, combined with low computational cost.
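A toy sketch of the hybrid idea: score lexicon words with both a Hamming-style similarity (good for substitution-type typing errors) and an n-gram overlap (tolerant of insertions and deletions), then rank by a combined score. The equal weighting is an assumption, the phonetic component is omitted, and none of this reproduces the paper's AURA scoring scheme:

```python
from collections import Counter

def hamming_score(a, b):
    """Positional matches over the longer length."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))

def ngram_score(a, b, n=2):
    """Shared character n-grams over the larger n-gram count."""
    ga = Counter(a[i:i + n] for i in range(len(a) - n + 1))
    gb = Counter(b[i:i + n] for i in range(len(b) - n + 1))
    return sum((ga & gb).values()) / max(sum(ga.values()), sum(gb.values()))

def overall_score(query, word, w_ham=0.5, w_ngram=0.5):
    # Hypothetical weighting; the paper defines its own scoring scheme.
    return w_ham * hamming_score(query, word) + w_ngram * ngram_score(query, word)

lexicon = ["separate", "desperate", "separately"]
print(sorted(lexicon, key=lambda w: -overall_score("seperate", w)))
```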
- Published
- 2003
- Full Text
- View/download PDF
19. Tracheomalacia and bronchomalacia in children: pathophysiology, assessment, treatment and anaesthesia management
- Author
- Jim Austin and Tariq Ali
- Subjects
- Anesthesia, General, Bronchoscopy, Humans, Child, Intensive care medicine, Tracheal Diseases, Bronchial Diseases, Infant, Intensive care unit, Surgery, Anesthesiology and Pain Medicine, Tracheomalacia, Spirometry, Treatment modality, Tracheobronchomalacia, Child, Preschool, Pediatrics, Perinatology and Child Health, Bronchomalacia, Tomography, X-Ray Computed, Anesthesia, Local
- Abstract
Tracheomalacia and bronchomalacia are becoming increasingly well recognized. Although pathologically benign conditions, they are responsible for considerable morbidity, occasional mortality and significant difficulties in the operating theatre and intensive care unit. We performed an extensive literature search to identify causal associations, methods of clinical and investigative assessment, treatment modalities and anaesthetic experience with these conditions.
- Published
- 2003
- Full Text
- View/download PDF
20. A comparison of a novel neural spell checker and standard spell checking algorithms
- Author
- Victoria J. Hodge and Jim Austin
- Subjects
- Matching (statistics), Recall, Artificial Intelligence, Computer science, Signal Processing, Supervised learning, Spell, Computer Vision and Pattern Recognition, Lexicon, Algorithm, Software, Word (computer architecture), Spelling
- Abstract
In this paper, we propose a simple and flexible spell checker using efficient associative matching in the AURA modular neural system. Our approach aims to provide a pre-processor for an information retrieval (IR) system, allowing the user's query to be checked against a lexicon and any spelling errors corrected, to prevent wasted searching. IR searching is computationally intensive, so much so that if we can prevent futile searches we can minimise computational cost. We evaluate our approach against several commonly used spell checking techniques for memory use, retrieval speed and recall accuracy. The proposed methodology has low memory use, high speed for word presence checking, reasonable speed for spell checking and a high recall rate.
- Published
- 2002
- Full Text
- View/download PDF
21. Hierarchical word clustering — automatic thesaurus generation
- Author
- Jim Austin and Victoria J. Hodge
- Subjects
- Text corpus, Thesaurus (information retrieval), Vocabulary, Artificial neural network, Synonym, Computer science, Cognitive Neuroscience, Machine learning, Computer Science Applications, Artificial Intelligence, Synonym (database), Artificial intelligence, Cluster analysis, Word (computer architecture), Natural language processing, Abstraction (linguistics)
- Abstract
In this paper, we propose a hierarchical, lexical clustering neural network algorithm that automatically generates a thesaurus (synonym abstraction) using purely stochastic information derived from unstructured text corpora and requiring no prior word classifications. The lexical hierarchy overcomes the Vocabulary Problem by accommodating paraphrasing through using synonym clusters and overcomes Information Overload by focusing search within cohesive clusters. We describe existing word categorisation methodologies, identifying their respective strengths and weaknesses and evaluate our proposed approach against an existing neural approach using a benchmark statistical approach and a human generated thesaurus for comparison. We also evaluate our word context vector generation methodology against two similar approaches to investigate the effect of word vector dimensionality and the effect of the number of words in the context window on the quality of word clusters produced. We demonstrate the effectiveness of our approach and its superiority to existing techniques.
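The purely stochastic starting point (context vectors built from co-occurrence counts alone, with no prior word classes) can be sketched as follows; the window size and toy corpus are illustrative, and the paper's dimensionality choices are not reproduced:

```python
from collections import defaultdict

def context_vectors(tokens, window=2):
    """Count co-occurrences within +/- `window` positions of each word."""
    vectors = defaultdict(lambda: defaultdict(int))
    for i, word in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                vectors[word][tokens[j]] += 1
    return vectors

text = "the cat sat on the mat the dog sat on the rug".split()
vectors = context_vectors(text)
print(dict(vectors["sat"]))  # "cat" and "dog" occupy similar contexts
```

Words with similar context vectors can then be merged bottom-up into the synonym clusters from which the thesaurus is built.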
- Published
- 2002
- Full Text
- View/download PDF
22. A science career story
- Author
- Jim Austin
- Subjects
- Employment, Pride, Liberal arts education, Multidisciplinary, Science, Media studies, Eastern Bloc, Engineering physics, Assistant professor, Ideal (ethics), Career Mobility, Publishing, Chapel, Wife, Sociology
- Abstract
I was valedictorian of a large public high school in Florida, attended a top liberal arts college—Swarthmore—and majored in physics. After a short, post-college stint as a small-town journalist, I entered the physics Ph.D. program at the University of North Carolina, Chapel Hill. I connected early with a research project that allowed me to publish often and well, although I did not love the work. I finished my Ph.D. fairly quickly—faster than I needed to, really, because my wife, a chemist, was still in graduate school. So I stayed in the same lab for a postdoc, doing the same work, funding the position with a grant proposal written by me and submitted in my adviser's name. From graduate school on, I do not remember receiving a single piece of career advice. A year or so into my postdoc, my adviser retired. I took over his lab, earning a very long title: visiting research assistant professor. I inherited some '80s-vintage electronics and a '60s-era lab with a desk in the corner. As my wife approached the end of her Ph.D., we began to consider our "two-body problem." We agreed that we would accept the first good offer either of us received. My job-market timing could not have been worse. The dissolution of the Soviet Union and the Eastern bloc sent many physicists and other scientists streaming west. Big corporate labs were downsizing and moving away from basic research, sending veteran physicists onto the academic job market. The early '90s employment crisis among young Ph.D. physicists made news. (You can read about it in Science.) I appeared successful. I was running my own funded lab and publishing in good journals. But it wasn't long before I realized that I wasn't competitive for tenure-track positions at the institutions where I wanted to work, including the one where I was already working. One rejection letter among the many I received thanked me for being part of a "remarkable cohort" of more than a thousand applicants for a single faculty post. I believe the number actually exceeded 1300. My wife applied for just one job and got the offer. I became the trailing spouse, following her to Maine, where she took up a faculty post. I took pride in defying science's gender stereotypes. It was surprisingly easy to walk away from the career that I had worked so long and hard to attain. The hard part came later, as I lost my knowledge of science and saw my mathematical facility dwindle, and as I struggled to fashion an identity that wasn't linked to professional attainment. I turned to writing, and when I wasn't writing, I was repairing and maintaining a passive-solar house in the country: shoveling snow, hauling tons of firewood up a steep hill, planting a vegetable garden that didn't get enough sun, and fulfilling various back-to-nature cliches. For a while, I taught writing part-time to undergraduates; it was ideal training for the work that would come later. We soon had a son, and (except for the breastfeeding part) I was the primary parent. In 1999 I founded an Internet publication, The Post-Careerist, which was focused on living a rich post-professional life. Through connections I made online, I became science editor at a pioneering Internet publication, BlueEar. Then, via an acquaintance in Pakistan—this was early online networking, before LinkedIn and Facebook—I heard about a writer/editor position at Science's Next Wave, Science Careers' predecessor.
I sent an e-mail and within weeks found myself with a full-time job for the first time in years. A few years later, I became the editor of Science Careers. My science career story is hardly unusual. Indeed, what's remarkable is how much it shares with so many other nontraditional career stories: uncertainty, exploration, a difficult transition, self-invention, and (eventually) satisfaction. So what's your story? Send stories, perspectives, opinions, and observations on careers in the sciences to me at SciCareerEditor{at}aaas.org.
- Published
- 2014
23. Hierarchical growing cell structures: TreeGCS
- Author
- Jim Austin and Victoria J. Hodge
- Subjects
- Artificial neural network, Computer science, Single-linkage clustering, Dendrogram, Stability (learning theory), Pattern recognition, Network topology, Computer Science Applications, Hierarchical clustering, Euclidean distance, Computational Theory and Mathematics, Canopy clustering algorithm, Unsupervised learning, Artificial intelligence, Data mining, Hierarchical clustering of networks, Cluster analysis, Image retrieval, Information Systems
- Abstract
We propose a hierarchical clustering algorithm (TreeGCS) based upon the Growing Cell Structure (GCS) neural network of B. Fritzke (1993). Our algorithm refines and builds upon the GCS base, overcoming an inconsistency in the original GCS algorithm, where the network topology is susceptible to the ordering of the input vectors. Our algorithm is unsupervised, flexible, and dynamic, and we have imposed no additional parameters on the underlying GCS algorithm. Our ultimate aim is a hierarchical clustering neural network that is both consistent and stable and identifies the innate hierarchical structure present in vector-based data. We demonstrate improved stability of the GCS foundation and evaluate our algorithm against the hierarchy generated by an ascendant hierarchical clustering dendrogram. Our approach emulates the hierarchical clustering of the dendrogram. It demonstrates the importance of the parameter settings for GCS and how they affect the stability of the clustering.
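TreeGCS itself is not reproduced here; the benchmark it is evaluated against, an ascendant (agglomerative) hierarchical clustering dendrogram, can be generated with SciPy on illustrative data:

```python
import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.3, (20, 2)),   # two innate clusters
                  rng.normal(3, 0.3, (20, 2))])

Z = linkage(data, method="ward")    # ascendant (bottom-up) merging
tree = dendrogram(Z, no_plot=True)  # build the hierarchy without plotting
print(tree["leaves"][:10])          # leaf ordering of the recovered hierarchy
```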
- Published
- 2001
- Full Text
- View/download PDF
24. Learning criteria for training neural network classifiers
- Author
- Jim Austin and Ping Zhou
- Subjects
- Mean squared error, Artificial neural network, Computer science, Pattern recognition, Function (mathematics), Perceptron, Machine learning, Cross entropy, Artificial Intelligence, Outlier, Radial basis function, Artificial intelligence, Software
- Abstract
This paper presents a study of two learning criteria and two approaches to using them for training neural network classifiers, specifically Multi-Layer Perceptron (MLP) and Radial Basis Function (RBF) networks. The first approach, which is a traditional one, relies on the use of two popular learning criteria, i.e. learning via minimising a Mean Squared Error (MSE) function or a Cross Entropy (CE) function. It is shown that the two criteria have different characteristics in learning speed and outlier effects, and that this approach does not necessarily result in a minimal classification error. To be suitable for classification tasks, in our second approach an empirical classification criterion is introduced for the testing process while using the MSE or CE function for the training. Experimental results on several benchmarks indicate that the second approach, compared with the first, leads to improved generalisation performance, and that the use of the CE function, compared with the MSE function, gives a faster training speed and improved or equal generalisation performance.
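For a two-class problem with targets y and predicted probabilities p, the two criteria are MSE = mean((y - p)^2) and CE = -mean(y log p + (1 - y) log(1 - p)). A minimal numeric sketch:

```python
import numpy as np

def mse(y, p):
    """Mean Squared Error criterion."""
    return np.mean((y - p) ** 2)

def cross_entropy(y, p, eps=1e-12):
    """Cross Entropy criterion; its gradient stays large for confident
    mistakes, consistent with the faster training the paper reports."""
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

y = np.array([1.0, 0.0, 1.0])
p = np.array([0.9, 0.2, 0.6])
print(mse(y, p), cross_entropy(y, p))
```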
- Published
- 1998
- Full Text
- View/download PDF
25. Graph matching by neural relaxation
- Author
- Mick Turner and Jim Austin
- Subjects
- Artificial neural network, Matching (graph theory), Computational complexity theory, Artificial Intelligence, Feature (computer vision), 3-dimensional matching, Relaxation (iterative method), Graph theory, Relaxation labelling, Algorithm, Software, Mathematics
- Abstract
We propose a new relaxation scheme for graph matching in computer vision. The main distinguishing feature of our approach is that matching is formulated as a process of eliminating unlikely candidates rather than finding the best match directly. Bayesian development leads to a robust algorithm which can be implemented in a fast and efficient manner on a neural network architecture. We illustrate the utility of the technique through comparisons with its conventional counterpart on simulated and real-world data.
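A toy discrete sketch of matching by elimination (the paper's Bayesian relaxation on a neural architecture is not reproduced; the compatibility test and data here are hypothetical): repeatedly delete any candidate label that no candidate for another node supports.

```python
def relax_by_elimination(candidates, compatible, rounds=10):
    """candidates: scene node -> set of candidate model labels.
    compatible(u, lu, v, lv): assumed pairwise consistency test."""
    for _ in range(rounds):
        changed = False
        for u, labels in candidates.items():
            for lu in list(labels):
                for v, vlabels in candidates.items():
                    if v != u and not any(compatible(u, lu, v, lv) for lv in vlabels):
                        labels.discard(lu)  # unsupported candidate: eliminate
                        changed = True
                        break
        if not changed:
            break
    return candidates

# Example: model adjacency (1, 2); label 3 for node "a" finds no support.
model_edges = {(1, 2)}
ok = lambda u, lu, v, lv: (lu, lv) in model_edges or (lv, lu) in model_edges
print(relax_by_elimination({"a": {1, 3}, "b": {2}}, ok))  # {'a': {1}, 'b': {2}}
```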
- Published
- 1998
- Full Text
- View/download PDF
26. CARMEN: an e-science virtual laboratory supporting collaboration in neuroinformatics
- Author
- Colin D. Ingram, Jim Austin, Leslie S. Smith, and Paul Watson
- Subjects
- Computer science, Storage Resource Broker, General Neuroscience, Neuroinformatics, Cloud computing, Data science, World Wide Web, Data sharing, Metadata, Upload, Cellular and Molecular Neuroscience, Data visualization, e-Science
- Abstract
Studies of neural networks and the processes they control frequently employ recording techniques to determine temporal patterns of activity within individual neurons and their interactions. Neuroinformatics is the rapidly growing science that addresses the manipulation and analysis of the vast volumes of data generated from such techniques. However, although these data are often difficult and expensive to produce, they are rarely shared and collaboratively exploited, and dissemination of new analysis methods may be restricted by issues of software and file compatibility. CARMEN (Code Analysis, Repository and Modeling for e-Neuroscience) aims to address these issues by creating an environment for handling time series data and for deploying analysis algorithms using distributed computing technology.
The CARMEN infrastructure builds heavily on software developed in previous e-science projects. The "Cloud" architecture allows the co-location of data and computation (avoiding the need to repeatedly transfer large quantities of data), enabling users to conduct their science through a web browser. The data handling capabilities of the CARMEN portal have recently been deployed, enabling registration of users and upload of data files. The primary data consists mainly of files of electrophysiological data, for which we use Storage Resource Broker to manage the distributed store. To provide a description of experimental protocols, an extensible metadata schema has been developed [1] and implemented using templates to avoid the necessity of re-entering values for common protocols. A security layer enables the contributor to control access rights to both the data and metadata, so that the originator and collaborators can share and analyze the data in a private environment until publication, when the data may be made public. This repository satisfies the requirements of funding agencies to make research output publicly available and provides a resource for computational neuroscientists.
The project consortium is developing new analysis methods including spike detection services that use wavelet and morphology techniques [2], a spike sorting methodology that extends WaveClus [3], information theoretic analysis and Bayesian network analysis to determine causal relations, and algorithms for resolving spike synchrony. An associated thick client tool, Signal Data Explorer, provides data visualization, signal processing and pattern matching capabilities. Because analysis applications need to be executed on a wide range of data formats, we have specified a uniform file and format structure for data sharing and communication between applications [4]. We are implementing an enactment engine to enable linking of applications into more complex and user-defined workflows.
- Published
- 2009
27. Book Reviews: We Who Would Take No Prisoners: Report of the Fifth International Conference on Prison Abolition, edited by Brian D. MacLean and Harold E. Pepinsky. Vancouver, Canada: Collective Press, 1991, 103 pages
- Author
- Jim Austin
- Subjects
- Political science, Media studies, Prison, Criminology, Law
- Published
- 1994
- Full Text
- View/download PDF
28. Reviewing the reviewers
- Author
- Jim Austin
- Subjects
- Male, Multidisciplinary, History, Writing, Public health, Comparative literature, Linguistics, Minor (academic), Public relations, Sex Factors, Humans, Female, Women
- Abstract
Anna Kaatz was working on a Ph.D. in comparative literature, studying old medical texts and public health codes, when her major professor was jailed and fired for sending sexually explicit material to a detective posing as a minor. Forced to abandon the project, Kaatz found her way to the then-new
- Published
- 2014
- Full Text
- View/download PDF
29. What it takes
- Author
- Jim Austin
- Subjects
- Male, Value (ethics), Multidisciplinary, Impact factor, Science, Sexism, Public relations, Odds, Career Mobility, Incentive, Ranking, San Francisco Declaration on Research Assessment, Institution, Humans, Female, Journal Impact Factor, Publication
- Abstract
Not long ago, Science Careers posted a widget—you can find it at —that lets early-career scientists calculate the probability that they'll someday become principal investigators (PIs), on the basis of a few standard publication metrics. The widget builds on work published 2 June in Current Biology ( ) in which the study's three authors—all early-career computational biologists—used PubMed data to study the influence of some 200 factors on academic scientists' career trajectories. Built by John Bohannon, a contributing writer for Science Careers, and David van Dijk of the Weizmann Institute of Science in Israel (the lead author of the Current Biology study), our widget employed a simplified model based on the same data set. The Science Careers widget is less accurate than the full-bore model, but it has the virtue of focusing attention on a handful of the most important parameters. Just enter values for a few mostly familiar metrics, and the widget displays a graph of PI probability—the probability of eventually occupying the last-author position on a peer-reviewed article—versus the parameter of your choosing. That makes it useful for savvy early-career scientists planning their ascents to independence. If they use it that way, they'll learn the following lessons: 1. Be male. The widget's probability plot displays two lines: red for women and blue for men. The blue line is above the red line across the whole range of probabilities, no matter what variable you display on the ordinate. In the scenario I ran, a woman needed two extra first-author publications or seven extra middle-author publications to reach the same probability of becoming a PI as a man with an otherwise identical record. 2. Be selfish. Do you value collaboration? Too bad: In my testing, an extra first-author publication increased the odds of becoming a PI by 17%; it would take eight middle-author publications to get a comparable boost. So, think twice before giving away that prized first-author slot. 3. Be elite. According to our widget, the institution you train at doesn't matter much—unless you're at one of the top 10 universities in the Academic Ranking of World Universities. Then, it matters a lot. In one scenario, moving from the number one institution (Harvard University) to the number two institution (Stanford University) decreased the chance of becoming a PI significantly more than moving from the 10th institution in the ranking (the University of Oxford) to the 100th (the University of Freiburg). 4. Publish in journals with high impact factors. If you've signed the San Francisco Declaration on Research Assessment ( ), or you just don't think a metric designed for a journal should be used to evaluate individual scientists, avert your eyes: Journal impact factor has a strong influence on a scientist's probability of attaining PI-ship. It's apples and oranges, but to the extent that a comparison can be made, the influence of journal impact factor seems stronger than that of either the number of citations your most cited article has received or of h-index, which is meant to measure a scientist's productivity and impact. You might say, then, that the impact factor of the journals you publish in matters more than your own personal impact factor. None of these results is terribly surprising, but they are more than a little depressing.
These four factors, all of which were found to be among the most important ingredients of academic career success, are at best indirectly linked—and at worst not linked at all—to rigorous, serious, and significant science. The real value of the PI-predictor widget, then, is not that it can help early-career scientists plan their careers; in fact, we should hope they don't use it that way. Its real value, rather, is that it so clearly demonstrates the wide gap between science's ideals and incentives. If we want young scientists to remain idealistic, then we need to figure out how to do a better job rewarding the things that really matter: discovery (often as part of a team) and solutions to society's most compelling problems.
- Published
- 2014
- Full Text
- View/download PDF
30. United States: Two Scientists and a Baby
- Author
- Jim Austin
- Subjects
- Multidisciplinary, Law, Conventional wisdom, Sociology
- Abstract
If you trust the conventional wisdom, Amy Palmer and Alexis Templeton did a lot of things wrong in their job search. Then why did things turn out so right?
- Published
- 2005
- Full Text
- View/download PDF
31. Getting on the Grid: Bringing Remote, Intermittent Energy Sources Into Line
- Author
- Jim Austin
- Subjects
- Multidisciplinary, Electrical engineering, Environmental science, Line (text file), Grid, Energy source, Energy (signal processing), Renewable energy
- Abstract
NISKAYUNA, NEW YORK--Juan de Bedout of GE Global Research works to integrate renewable energy sources--which are often intermittent and far from where the energy is needed--into an electrical-power grid that prefers a steady supply. (Read more: http://www.sciencemag.org/cgi/content/full/315/5813/870)
- Published
- 2007
- Full Text
- View/download PDF
32. Energy for the Long Haul
- Author
- Barbara R. Jasny, Brooks Hanson, Phil Szuromi, Jim Austin, and Daniel Clery
- Subjects
- Multidisciplinary, Fossil fuel, Energy consumption, Environmental economics, Renewable energy, Alternative energy, Economics, Kyoto Protocol, Electricity, European Union, Renewable resource
- Abstract
Perhaps the greatest challenge in realizing a sustainable future is energy consumption. It is ultimately the basis for a large part of the global economy, and more of it will be required to raise living standards in the developing world. Today, we are mostly dependent on nonrenewable fossil fuels that have been and will continue to be a major cause of pollution and climate change. Because of these problems, and our dwindling supply of petroleum, finding sustainable alternatives is becoming increasingly urgent. This special issue focuses on some of the challenges and efforts needed to harness renewable energy more effectively at a sufficient scale to make a difference and some of the people who are working on these problems. As introduced in the first News article (p. 782), the Editorial by Holdren (p. 737), and the Perspective by Whitesides and Crabtree (p. 796), many of the outstanding questions require major research efforts in underfunded areas. Much of the focus on sustainable energy is aimed at different ways of tapping into the most abundant renewable resource: solar energy. Lewis (p. 798) points out that the direct conversion of sunlight with solar cells, either into electricity or hydrogen, faces cost hurdles independent of their intrinsic efficiency. Ways must be found to lower production costs and design better conversion and storage systems. In the short term, utilization of biomass relies mainly on sugar fermentation; Goldemberg (p. 808) discusses how Brazil's use of ethanol from sugarcane has greatly reduced its need for imported oil. Many long-term goals have been set for biomass utilization; for example, the European Union (EU) hopes to produce a quarter of its transportation fuels from biomass by 2030, as discussed by Himmel et al. (p. 804). Better ways are also needed for processing the available sugars, and conversion to higher alcohols or even alkanes is desirable. Stephanopoulos (p. 801) explores the options afforded by reengineering biosynthetic pathways in microbes. How we tackle energy problems will turn on a number of policy issues. Potočnik (p. 810) discusses how the EU is setting targets and allocating funding for alternative energy. Finally, Schrag (p. 812) explores the feasibility of sequestering carbon dioxide from fossil-fuel use and our technological readiness and willingness to implement such schemes. The News section profiles national lab directors, computer modelers, captains of industry, and bench scientists who are writing the early chapters of the next book on energy research. Some of them are developing better plants to grow as fuel or ways to convert them into ethanol. Others are developing catalysts to extract hydrogen from water or generate electricity from hydrogen. What they all share is a desire to find new ways to power the future. ScienceCareers.org takes a look at three young private-sector scientists who are on their first steps to careers in energy R&D: a consultant helping Israel meet its obligations under the Kyoto Protocol, a former particle physicist designing solar energy systems, and a Ph.D.-level engineer integrating sustainable electricity supplies into the power grid.
- Published
- 2007
- Full Text
- View/download PDF
33. CARMEN: Code analysis, Repository and Modeling for e-Neuroscience
- Author
- Bojian Liang, Mark Jessop, Jim Austin, Martyn Fletcher, Thomas Jackson, Leslie S. Smith, Michael Weeks, Colin Ingram, and Paul Watson
- Subjects
- Computer science, Workflow, Static program analysis, World Wide Web, Software as a service, E-Science, Web application, General Environmental Science, Virtual collaboration, Virtual machine, General Earth and Planetary Sciences, Executable, Neuroscience
- Abstract
The CARMEN (Code, Analysis, Repository and Modelling for e-Neuroscience) system [1] provides a web-based portal platform through which users can share and collaboratively exploit data, analysis code and expertise in neuroscience. The system has been developed in the UK and currently supports 200 neuroscientists working in a Virtual Environment with an initial focus on electrophysiology data. The proposal here is that the CARMEN system provides an excellent base from which to develop an 'executable paper' system. CARMEN has been built by York and Newcastle Universities and is based on over 10 years' experience in the construction of eScience-based distributed technology. CARMEN started four years ago involving 20 scientific investigators (neuroscientists and computer scientists) at 11 UK Universities (www.CARMEN.org.uk). The project is supported for another 4 years at York and Newcastle, along with a sister project to take the underlying technology and pilot it as a UK platform for supporting the sharing of research outputs in a generic way. An entirely natural extension to the CARMEN system would be its alignment with a publications repository. The CARMEN system is operational at https://portal.CARMEN.org.uk, where it is possible to request a login to try out the system.
- Full Text
- View/download PDF
34. A Simple Method for the Filing of Microfilm Records in Short Length Strips
- Author
- Jim Austin and Harold P. Brown
- Subjects
- Multidisciplinary, Computer science, Simple (abstract algebra), STRIPS, Short length, Algorithm
- 1939
- Full Text
- View/download PDF