396 results on '"W. Albrecht"'
Search Results
2. VASCULAR PLANT AND VERTEBRATE INVENTORY OF CASA GRANDE RUINS NATIONAL MONUMENT
- Author
-
Brian F. Powell, Eric W. Albrecht, Cecilia A. Schmidt, Pamela Anning, and Kathleen Docherty
- Published
- 2023
- Full Text
- View/download PDF
3. Accelerating Radius-Margin Parameter Selection for SVMs Using Geometric Bounds
- Author
-
David W. Albrecht, Ben Goodrich, and Peter Tischer
- Subjects
Support vector machine, Mathematical optimization, Normalization property, Two parameter, Optimization problem, Iterative method, Estimation theory, Other information and computing sciences not elsewhere classified, Upper and lower bounds, Mathematics
- Abstract
Support Vector Machine (SVM) parameter selection has previously been performed by minimizing the ratio of the radius of the Minimal Enclosing Ball (MEB) enclosing the training data to the margin of the SVM. By considering the geometric properties of the SVM and MEB optimization problems, we show that upper and lower bounds on the radius-margin ratio of an SVM can be efficiently computed at any point during training. We use these bounds to accelerate radius-margin parameter selection by terminating training routines as early as possible, while still obtaining a guarantee that the parameters minimize the radius-margin ratio. Once an SVM has been partially trained on any set of parameters, we also show that these bounds can be used to evaluate and possibly reject neighboring parameter values with little or no additional training required. Empirical results show that this process can reduce the number of training iterations required in order to perform model selection by a factor of 10 or more, while suffering no loss of precision in minimizing the radius-margin ratio.
- Published
- 2022
- Full Text
- View/download PDF
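Editor's note: to make the radius-margin criterion of entry 3 concrete, the sketch below does brute-force parameter selection for an RBF SVC by minimizing an estimated R²/margin². It only illustrates the quantity being minimized, not the paper's bound-based early-termination method; the grid values and the centroid-distance proxy for the MEB radius are assumptions.

```python
# Minimal sketch (not the paper's method): pick (C, gamma) for an RBF SVC by
# minimizing an estimated radius-margin ratio R^2 * ||w||^2 over a small grid.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

def radius_margin_ratio(X, y, C, gamma):
    svm = SVC(kernel="rbf", C=C, gamma=gamma).fit(X, y)
    K = rbf_kernel(X, X, gamma=gamma)
    sv = svm.support_
    alpha_y = svm.dual_coef_.ravel()                   # alpha_i * y_i for the support vectors
    w_norm_sq = alpha_y @ K[np.ix_(sv, sv)] @ alpha_y  # ||w||^2, so margin = 1 / ||w||
    # Crude proxy for the squared MEB radius: the largest squared kernel-space
    # distance to the kernel-space centroid (the exact MEB needs a small QP).
    r_sq = np.max(np.diag(K) - 2 * K.mean(axis=1) + K.mean())
    return r_sq * w_norm_sq                            # = R^2 / margin^2

grid = [(C, g) for C in (0.1, 1, 10, 100) for g in (0.01, 0.1, 1.0)]
print("selected (C, gamma):", min(grid, key=lambda cg: radius_margin_ratio(X, y, *cg)))
```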
4. Towards Detecting Inactivity Using an In-Home Monitoring System
- Author
-
David W. Albrecht, Masud Moshtaghi, Ingrid Zukerman, and R. Andrew Russell
- Subjects
Risk analysis (engineering), Remote patient monitoring, Computer science, Other information and computing sciences not elsewhere classified, Statistical model, Monitoring system, World population
- Abstract
The ageing of the world population underscores the need for in-home monitoring systems that help elderly people remain safely in their homes. This paper introduces a statistical model that addresses one of the main concerns of elderly people and their carers, viz detection of safe movement in the house. Our model identifies unusually long periods of inactivity within different regions in a home using non-intrusive sensors. Our evaluation on two real-life datasets yields encouraging results.
- Published
- 2022
- Full Text
- View/download PDF
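Editor's note: the statistical model in entry 4 is not spelled out in the abstract, so the sketch below only illustrates the general idea of flagging unusually long inactivity per region against historical sensor-firing gaps. The event format, the per-region quantile rule, and the toy data are assumptions, not the paper's model.

```python
# Hedged sketch: flag unusually long inactivity in a region by comparing the
# time since its last sensor firing with a high quantile of historical gaps.
import numpy as np
from collections import defaultdict

def fit_gap_thresholds(events, q=0.99):
    """events: time-ordered iterable of (timestamp_seconds, region) sensor firings."""
    gaps, last_seen = defaultdict(list), {}
    for t, region in events:
        if region in last_seen:
            gaps[region].append(t - last_seen[region])
        last_seen[region] = t
    return {r: np.quantile(g, q) for r, g in gaps.items() if g}

def inactivity_alerts(now, last_seen, thresholds):
    """Return regions whose current inactivity exceeds the learned threshold."""
    return [r for r, t in last_seen.items()
            if r in thresholds and (now - t) > thresholds[r]]

# Toy usage: the kitchen fires every ~60 s historically, then goes quiet for 11 min.
history = [(i * 60.0, "kitchen") for i in range(100)]
thr = fit_gap_thresholds(history)
print(inactivity_alerts(now=6600.0, last_seen={"kitchen": 5940.0}, thresholds=thr))
```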
5. Therapeutic potentials associated with biological properties of Juniper berry oil (Juniperus communis L.) and its therapeutic use in several diseases – A Review
- Author
-
Uwe W Albrecht and Ahmed Madisch
- Subjects
Food Science
- Abstract
Juniperus communis L. is a plant that belongs to the Cupressaceae family. It grows as either a shrub or small tree and is widely distributed across the Northern Hemisphere, including northern Europe, Asia, and America. The berries are an efficient source of several bioactive structures. This review article will focus on the current status of the therapeutic use of juniper berry essential oil, which is presently indicated as a herbal medicinal treatment for dyspepsia. Interest in plant-based medicinal products is growing, and therefore it is important that accessible, up-to-date research is available to patients. Many plants are a natural source of therapeutic structures and can therefore often provide an alternative to synthetic pharmacology. A main constituent of juniper berry oil is α-pinene, a highly active structure which has been shown in in vitro and in vivo studies to possess several biological activities. This review sums up the available reports and indications which describe the function and value of juniper berry essential oil and, especially, the constituent α-pinene as a potential candidate in several disorders and inflammatory conditions.
Keywords: Juniperus communis, dyspepsia, juniper berry oil, Antioxidant activity, Antibacterial activity, Anti-inflammatory activity
- Published
- 2022
- Full Text
- View/download PDF
6. Isolation-based anomaly detection using nearest-neighbor ensembles
- Author
-
Ye Zhu, David W. Albrecht, Fei Tony Liu, Jonathan R. Wells, Tharindu Bandaragoda, and Kai Ming Ting
- Subjects
Computer science, Pattern recognition, Ensemble learning, k-nearest neighbors algorithm, Computational Mathematics, Artificial Intelligence, Anomaly detection, Isolation (database systems)
- Published
- 2018
- Full Text
- View/download PDF
7. Quality Controlled Region-Based Partial Fingerprint Recognition
- Author
-
Nandita Bhattacharjee, Bala Srinivasan, Omid Zanganeh, David W. Albrecht, and Komal Komal
- Subjects
Computer science, Communication, Media Technology, Quality (business), Pattern recognition, Artificial intelligence, Similarity measure, Industrial and Manufacturing Engineering
- Published
- 2018
- Full Text
- View/download PDF
8. Tragic Drama in the Golden Age of Spain: Seven Essays on the Definition of a Genre by Henry W. Sullivan
- Author
-
Jane W. Albrecht
- Subjects
Literature, General Medicine, Art, Drama
- Published
- 2019
- Full Text
- View/download PDF
9. Fluor's Econamine FG Plus℠ Completes Test Program at Uniper's Wilhelmshaven Coal Power Plant
- Author
-
W. Albrecht, J. Yonkoski, Satish Reddy, Robin Irons, and Helmut Rode
- Subjects
Engineering, Waste management, Power station, Process configuration, Energy consumption, Hard coal, Test program, General Earth and Planetary Sciences, Demonstration Plant, Coal, Coal power plant, General Environmental Science
- Abstract
Fluor and Uniper collaborated to jointly build a demonstration carbon dioxide recovery plant. The capture plant is based on Fluor's Econamine FG Plus℠ technology and is located at Uniper's hard coal power plant in Wilhelmshaven, Germany. The demonstration plant, which has a CO2 capture capacity of 70 t/d, utilizes a process configuration that incorporates several recent technology enhancements that can be directly applied to a full-sized plant. In late 2015, the project successfully completed a 3-year test campaign. The test program generated useful data related to energy consumption, emissions profiles, and other information to facilitate the scale-up of the process to a full-sized coal-based power plant.
- Published
- 2017
- Full Text
- View/download PDF
10. 'Volved los ojos, y vello': Staging Comedia Corpses
- Author
-
Jane W. Albrecht
- Published
- 2017
- Full Text
- View/download PDF
11. ZERO++: Harnessing the Power of Zero Appearances to Detect Anomalies in Large-Scale Data Sets
- Author
-
Kai Ming Ting, Guansong Pang, David W. Albrecht, and Huidong Jin
- Subjects
Computer science, Linear subspace, Artificial Intelligence, Artificial Intelligence & Image Processing, Anomaly detection, Data mining, Algorithm, Time complexity, Categorical variable
- Abstract
© 2016 AI Access Foundation. This paper introduces a new unsupervised anomaly detector called ZERO++ which employs the number of zero appearances in subspaces to detect anomalies in categorical data. It is unique in that it works in regions of subspaces that are not occupied by data; whereas existing methods work in regions occupied by data. ZERO++ examines only a small number of low-dimensional subspaces to successfully identify anomalies. Unlike existing frequency-based algorithms, ZERO++ does not involve subspace pattern searching. We show that ZERO++ is better than or comparable with the state-of-the-art anomaly detection methods over a wide range of real-world categorical and numeric data sets; and it is efficient with linear time complexity and constant space complexity which make it a suitable candidate for large-scale data sets.
- Published
- 2016
- Full Text
- View/download PDF
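Editor's note: a minimal sketch of the zero-appearance idea described in entry 11: an instance is scored by how many randomly chosen low-dimensional subspaces contain a value combination never seen in a reference sample. The subspace size, ensemble size, and sample size below are illustrative assumptions, not the published algorithm's settings.

```python
# Hedged sketch of the zero-appearance idea behind ZERO++ (simplified scoring).
import random
from collections import Counter

def fit_zero_tables(data, n_subspaces=50, dim=2, sample_size=32, seed=0):
    rng = random.Random(seed)
    n_features = len(data[0])
    tables = []
    for _ in range(n_subspaces):
        feats = tuple(rng.sample(range(n_features), dim))       # a random low-dimensional subspace
        sample = rng.sample(data, min(sample_size, len(data)))   # reference subsample
        seen = Counter(tuple(row[f] for f in feats) for row in sample)
        tables.append((feats, seen))
    return tables

def zero_score(x, tables):
    """Higher score = more subspaces in which x's value combination was never seen."""
    return sum(1 for feats, seen in tables
               if tuple(x[f] for f in feats) not in seen)

# Toy usage on categorical rows: the last query mixes values never seen together.
rows = [("a", "x", "p")] * 50 + [("b", "y", "q")] * 50
tables = fit_zero_tables(rows)
print(zero_score(("a", "x", "p"), tables), zero_score(("b", "x", "z"), tables))
```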
12. Reviews of Books
- Author
-
Ian Mackenzie, Maria Boguszewicz, Kimberly A. Nance, Don W. Cruickshank, Christopher J. Henstock, Grace Magnier, Eberhard Geisler, John A. Jones, Jane W. Albrecht, Ruth Sánchez Imizcoz, Brígida M. Pastor, Julia Biggane, Richard Baxell, Peter Anderson, Sally Faulkner, Carlota Larrea, Avi Astor, Samuel Amago, Jesse Barker, Mercedes Carbayo Abengózar, Ronald W. Sousa, David F. Slade, Susan Elizabeth Ramírez, Nicola Miller, Daniel Nappo, and Timothy D. Wilson
- Subjects
Cultural Studies, Literature and Literary Theory
- Published
- 2016
- Full Text
- View/download PDF
13. Los cigarrales de la privanza y el mecenazgo en Tirso de Molina by Juan Pablo Gil-Osle
- Author
-
Jane W. Albrecht
- Subjects
General Medicine
- Published
- 2017
- Full Text
- View/download PDF
14. Findings from machine learning in clinical medical imaging applications – Lessons for translation to the forensic setting
- Author
-
Carlos A. Peña-Solórzano, Matthew Richard Dimmock, David W. Albrecht, Richard Bassed, and Michael D. Burke
- Subjects
Diagnostic Imaging, Support Vector Machine, Machine learning, Bone and Bones, Pathology and Forensic Medicine, Medical imaging, Humans, Clinical imaging, Lung, Foreign Bodies, Training set, Brain, Forensic Medicine, Forensic radiology, Clinical medicine, Neural Networks, Computer, Artificial intelligence, Law, Algorithms, CT, MRI
- Abstract
Highlights
• Machine learning techniques are currently not widely used in addressing forensic imaging problems.
• Machine learning applications can be transferred from clinical to forensic settings.
• Convolutional neural networks are outperforming other machine learning techniques in image processing tasks.
• There is a dearth of quantitative metrics presented for inter-study comparisons.
• A wider use of histopathological gold standard metrics for assessing algorithm performance is required.
Machine learning (ML) techniques are increasingly being used in clinical medical imaging to automate distinct processing tasks. In post-mortem forensic radiology, the use of these algorithms presents significant challenges due to variability in organ position, structural changes from decomposition, inconsistent body placement in the scanner, and the presence of foreign bodies. Existing ML approaches in clinical imaging can likely be transferred to the forensic setting with careful consideration to account for the increased variability and temporal factors that affect the data used to train these algorithms. Additional steps are required to deal with these issues, by incorporating the possible variability into the training data through data augmentation, or by using atlases as a pre-processing step to account for death-related factors. A key application of ML would be then to highlight anatomical and gross pathological features of interest, or present information to help optimally determine the cause of death. In this review, we highlight results and limitations of applications in clinical medical imaging that use ML to determine key implications for their application in the forensic setting.
- Published
- 2020
- Full Text
- View/download PDF
15. Semi-supervised labelling of the femur in a whole-body post-mortem CT database using deep learning
- Author
-
Matthew Richard Dimmock, J. Gillam, David W. Albrecht, Richard Bassed, Carlos A. Peña-Solórzano, and Peter C. Harris
- Subjects
Knee replacement, Health Informatics, Deep Learning, Whole Body Imaging, Femur, Segmentation, Mathematics, Orientation (computer vision), Autoencoder, Sagittal plane, Computer Science Applications, Coronal plane, Autopsy, Implant, Tomography, X-Ray Computed, Nuclear medicine
- Abstract
A deep learning pipeline was developed and used to localize and classify a variety of implants in the femur contained in whole-body post-mortem computed tomography (PMCT) scans. The results provide a proof-of-principle approach for labelling content not described in medical/autopsy reports. The pipeline, which incorporated residual networks and an autoencoder, was trained and tested using n = 450 full-body PMCT scans. For the localization component, Dice scores of 0.99, 0.96, and 0.98 and mean absolute errors of 3.2, 7.1, and 4.2 mm were obtained in the axial, coronal, and sagittal views, respectively. A regression analysis found the orientation of the implant to the scanner axis and also the relative positioning of extremities to be statistically significant factors. For the classification component, test cases were properly labelled as nail (N+), hip replacement (H+), knee replacement (K+) or without-implant (I−) with an accuracy >97%. The recall for I− and H+ cases was 1.00, but fell to 0.82 and 0.65 for cases with K+ and N+. This semi-automatic approach provides a generalized structure for image-based labelling of features, without requiring time-consuming segmentation.
- Published
- 2020
- Full Text
- View/download PDF
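Editor's note: as a rough illustration of the classification component described in entry 15 (a residual network with a four-class head for N+, H+, K+ and I−), here is a generic PyTorch/torchvision sketch. It is not the authors' pipeline; the input format, weights, and training loop are omitted or assumed.

```python
# Hedged sketch: a generic residual-network classifier with a four-class head
# (nail, hip replacement, knee replacement, no implant).  Illustrative only.
import torch
import torch.nn as nn
from torchvision.models import resnet18

CLASSES = ["N+", "H+", "K+", "I-"]

model = resnet18(weights=None)                       # trained from scratch on CT-derived images (assumed)
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))

def predict(batch):
    """batch: float tensor of shape (B, 3, H, W), e.g. three stacked CT slices or views."""
    model.eval()
    with torch.no_grad():
        logits = model(batch)
    return [CLASSES[i] for i in logits.argmax(dim=1).tolist()]

print(predict(torch.randn(2, 3, 224, 224)))          # untrained network, arbitrary labels
```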
16. Parameter Recovery Using Radon Transform
- Author
-
Komal Komal, Bala Srinivasan, David W. Albrecht, and Nandita Bhattacharjee
- Subjects
Radon transform, Computational complexity theory, Computer science, Image processing, Transformation (function), Image alignment, Pixel based, Computer vision, Artificial intelligence
- Abstract
In computer vision, alignment of images plays an important role in many day-to-day applications by bringing similar points of the images into correspondence. Conventionally, images are aligned using extracted image features. However, feature extraction is difficult, time-consuming, and not suitable for low-quality images. Pixel-based alignment methods, by contrast, are applicable to low-quality images and do not require extensive image processing. However, these methods are computationally expensive, which makes them difficult to use in real-time applications. Therefore, a pixel-based method for image alignment is needed that overcomes this computational cost. In this paper, we propose an efficient Radon transform based method to compute the transformation parameters between two images in order to align them. The proposed approach is verified on test images of varying classes, and the results demonstrate that it can compute the parameters accurately and efficiently compared to existing approaches.
- Published
- 2018
- Full Text
- View/download PDF
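Editor's note: entry 16 recovers transformation parameters from Radon transforms. The sketch below illustrates one well-known piece of that idea, estimating the relative rotation between two images from the angular shift of their sinograms; it is a simplified stand-in, not the paper's full parameter-recovery method, and the skimage test image is an assumption.

```python
# Hedged sketch: rotating an image circularly shifts its sinogram along the angle
# axis, so the shift maximizing the correlation of angle profiles estimates rotation.
import numpy as np
from skimage.data import camera
from skimage.transform import radon, rotate

def _disc(img):
    # Keep only the inscribed disc, whose content is rotation-invariant,
    # so the sinogram shift property holds cleanly.
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    return img * (((yy - h / 2) ** 2 + (xx - w / 2) ** 2) <= (min(h, w) / 2) ** 2)

def estimate_rotation(img_a, img_b):
    theta = np.arange(180)
    prof_a = radon(_disc(img_a), theta=theta).std(axis=0)   # one value per projection angle
    prof_b = radon(_disc(img_b), theta=theta).std(axis=0)
    scores = [np.dot(prof_a, np.roll(prof_b, s)) for s in range(180)]
    return int(np.argmax(scores)) % 180                     # degrees, modulo 180

img = camera().astype(float)
print(estimate_rotation(img, rotate(img, angle=30)))  # roughly 30, up to the angle/sign convention
```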
17. Hot-Melt Adhesives
- Author
-
Maynard Lawrence, Beth M. Eichler, Cynthia L. Rickey, Steven W. Albrecht, Thomas F. Kauffman, Thomas H. Quinn, and Robert A. Dubois
- Subjects
Paperboard, Wax, Ethylene, Materials science, Polymer, Tackifier, Viscosity, Hot-melt adhesive, Adhesive, Composite material
- Abstract
Disclosed are hot melt adhesives comprising at least one first homogeneous linear or substantially linear ethylene polymer having a particular density and melt viscosity at 350 °F (177 °C), and an optional wax and tackifier. In particular, disclosed is a hot melt adhesive characterized by: a) at least one homogeneous linear or substantially linear interpolymer of ethylene with at least one C3-C20 α-olefin interpolymer having a density from 0.850 g/cm3 to 0.895 g/cm3; and b) optionally at least one tackifying resin; and c) optionally at least one wax, wherein the hot melt adhesive has a viscosity of less than about 5000 cPs (50 grams/(cm·second)) at 150 °C. Preferred hot melt adhesives for use in adhering cardboard or paperboard are disclosed, as well as the resultant adhered products. Also disclosed is a dual reactor process for the preparation of the inventive hot melt adhesives.
- Published
- 2018
- Full Text
- View/download PDF
18. Toddler Techie Touch Generation
- Author
-
David W. Albrecht, Mark Wesley Power, and Kirsten Ellis
- Subjects
Computer science, Interaction design, Human–computer interaction, User group, Unintended Movement, Selection (linguistics), Domain knowledge, Toddler, Human factors
- Abstract
This paper reports on a study of the interaction skills of forty-two children, between the ages of eighteen months and forty-two months, in using touch devices. A majority of the children had prior experience with touch devices. Continuous swiping, discrete touching and directional swiping were found to be the easiest actions to complete. The drag interaction was more difficult, but most children could complete it. The pinch, stretch and rotate interactions were the most difficult for the children to make successfully. Common errors included unintended movement during interactions, pressing too hard, and lack of precision due in part to the target size. This study expands the domain knowledge about a toddler's ability to interact with touch devices, allowing better creation and selection of interfaces for them to use.
- Published
- 2018
- Full Text
- View/download PDF
19. Development of a simple numerical model for trabecular bone structures
- Author
-
Richard Bassed, Matthew Richard Dimmock, Chris Hall, Carlos A. Peña-Solórzano, David W. Albrecht, David M. Paganin, and Peter C. Harris
- Subjects
Materials science, Standard deviation, Synthetic data, Bone Density, Image Processing, Computer-Assisted, Humans, Australian Synchrotron, Absorption (electromagnetic radiation), Phantoms, Imaging, Ulna, Numerical Analysis, Computer-Assisted, General Medicine, Radius, Cancellous Bone, Tomography, X-Ray Computed, Algorithms, Biomedical engineering
- Abstract
Purpose: Advances in additive manufacturing processes are enabling the fabrication of surrogate bone structures for applications including use in high-resolution anthropomorphic phantoms. In this research, a simple numerical model is proposed that enables the generation of microarchitecture with similar statistical distribution to trabecular bone. Methods: A human humerus, radius, ulna, and several vertebrae were scanned on the Imaging and Medical beamline at the Australian Synchrotron, and the proposed numerical model was developed through the definition of two complex functions that encode the trabecular thickness and position-dependent spacing to generate volumetric surrogate trabecular structures. The structures reproduced those observed at 19 separate axial locations through the experimental bone volumes. The applicability of the model when incorporating a two-material approximation to absorption- and phase-contrast CT was also investigated through simulation. Results: The synthetic structures, when compared with the real trabecular microarchitecture, yielded an average mean thickness error of 2 μm, and a mean difference in standard deviation of 33 μm for the humerus, 24 μm for the ulna and radius, and 15 μm for the vertebrae. Simulated absorption- and propagation-based phase contrast CT projection data were generated and reconstructed using the derived mathematical simplifications from the two-material approximation, and the phase-contrast effects were successfully demonstrated. Conclusions: The presented model reproduced trabecular distributions that could be used to generate phantoms for quality assurance and validation processes. Utilizing a two-material approximation results in simplification of the additive manufacturing process and the generation of synthetic data that could be used for training of machine learning applications.
- Published
- 2018
20. Modeling Failures in Water Mains Using the Minimum Monthly Antecedent Precipitation Index
- Author
-
David W. Albrecht, Jayantha Kodikara, and Li Chik
- Subjects
Geography, Planning and Development, Poisson process, Management, Monitoring, Policy and Law, Covariate, Statistics, Environmental science, Precipitation index, Water Science and Technology, Civil and Structural Engineering
- Abstract
This paper examines the performance of the minimum monthly antecedent precipitation index (MMAPI) and other covariates in modeling pipe failures using the nonhomogeneous Poisson process (NH...
- Published
- 2018
- Full Text
- View/download PDF
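Editor's note: entry 20 models pipe failures with a nonhomogeneous Poisson process and covariates such as the MMAPI. As a simplified stand-in, the sketch below fits a Poisson regression with an exposure offset on synthetic per-pipe data; the column names, synthetic covariates, and coefficients are assumptions, not the paper's data or model.

```python
# Hedged sketch: Poisson regression with an exposure offset as a simple stand-in
# for an NHPP failure-rate model with covariates (e.g. pipe age and a climate index).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "age": rng.uniform(5, 80, n),            # pipe age in years (assumed covariate)
    "api": rng.normal(0, 1, n),              # standardized antecedent precipitation index (assumed)
    "length_km": rng.uniform(0.1, 2.0, n),   # exposure
})
true_rate = np.exp(-4.0 + 0.03 * df["age"] - 0.4 * df["api"])
df["failures"] = rng.poisson(true_rate * df["length_km"])

X = sm.add_constant(df[["age", "api"]])
model = sm.GLM(df["failures"], X, family=sm.families.Poisson(),
               offset=np.log(df["length_km"])).fit()
print(model.params)                           # recovers roughly (-4.0, 0.03, -0.4)
```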
21. iOOBN: A Bayesian Network Modelling Tool Using Object Oriented Bayesian Networks with Inheritance
- Author
-
Samiullah, Kevin B. Korb, Ann E. Nicholson, David W. Albrecht, and Thao Xuan Hoang
- Subjects
Computer science, Bayesian network, Reuse, Machine learning, Inheritance (object-oriented programming), Scalability, Programming paradigm, Artificial intelligence, Reusability
- Abstract
The construction of Bayesian Networks (BNs) to model large-scale real-life problems is challenging. One approach to scaling up is Object Oriented Bayesian Networks (OOBNs). These provide modellers with the ability to define classes and construct models with a compositional and hierarchical structure, enabling reuse and supporting maintenance. In the OO programming paradigm, a key concept is inheritance, the ability to derive attributes and behavior from pre-existing classes, which enables an even higher level of reusability and scalability. However, inheritance in OOBNs has yet to be fully defined and implemented. Here we present iOOBN, a tool which provides fully defined inheritance for OOBNs. We provide guidance on modelling in iOOBN, describe our prototype implementation with an existing BN software tool, Hugin, and demonstrate its applicability and usefulness via a case study of re-engineering an existing large complex dynamic OOBN.
- Published
- 2017
- Full Text
- View/download PDF
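Editor's note: to illustrate the object-oriented modelling idea behind entry 21 (classes of network fragments that can be instantiated, composed, and specialised by inheritance), here is a plain-Python sketch. It is not the iOOBN tool or the Hugin API; the fragment, node names, and CPT values are invented for illustration.

```python
# Hedged sketch: OOBN-style composition and inheritance with plain Python classes.
class SensorFragment:
    nodes = {"State": ["ok", "faulty"], "Reading": ["low", "high"]}
    edges = [("State", "Reading")]
    cpts = {"State": {(): [0.95, 0.05]},
            "Reading": {("ok",): [0.2, 0.8], ("faulty",): [0.7, 0.3]}}

    @classmethod
    def instantiate(cls, prefix):
        """Return the fragment's nodes/edges/CPTs with instance-specific names."""
        p = lambda n: f"{prefix}.{n}"
        return {"nodes": {p(n): v for n, v in cls.nodes.items()},
                "edges": [(p(a), p(b)) for a, b in cls.edges],
                "cpts": {p(n): t for n, t in cls.cpts.items()}}

class NoisySensorFragment(SensorFragment):
    # Inheritance: reuse the structure, override one conditional distribution.
    cpts = {**SensorFragment.cpts,
            "Reading": {("ok",): [0.4, 0.6], ("faulty",): [0.6, 0.4]}}

network = {"edges": []}
for name, cls in [("s1", SensorFragment), ("s2", NoisySensorFragment)]:
    network["edges"].extend(cls.instantiate(name)["edges"])
print(network["edges"])   # [('s1.State', 's1.Reading'), ('s2.State', 's2.Reading')]
```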
22. Estimation of the Short-Term Probability of Failure in Water Mains
- Author
-
Jayantha Kodikara, Li Chik, and David W. Albrecht
- Subjects
Estimation, Engineering, Receiver operating characteristic, Geography, Planning and Development, Bayesian probability, Management, Monitoring, Policy and Law, Expected value, Ranking, Statistics, Covariate, Water Science and Technology, Civil and Structural Engineering
- Abstract
In this paper, the nonhomogeneous Poisson process (NHPP), hierarchical beta process (HBP), and a newly developed Bayesian simple model (BSM) are used for short-term failure prediction with several water utility failure data sets. The prediction curve was used to investigate the relative ranking of the pipes from the three models. The curve was demonstrated to be more suitable for practical situations compared to the traditional receiver operating characteristic (ROC) curve. The expected number of failures and the effect of incorporating covariates by dividing the pipe assets into smaller pressure cohorts were also examined. Based on the data sets used, the observations from the prediction curves show that the performance of the three models are very similar in terms of pipe ranking. However, the BSM is more advantageous because of its relative simplicity. The covariate, the number of known past failures, plays a very important role when considering the relative ranking of the pipes in the network. T...
- Published
- 2017
- Full Text
- View/download PDF
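Editor's note: entry 22 evaluates models by how well they rank pipes, using a prediction curve. The sketch below builds a lift-style curve from model scores and observed failures; the exact curve definition in the paper may differ, and the synthetic scores and failures are assumptions.

```python
# Hedged sketch of a prediction (lift-style) curve: rank pipes by a model score
# and accumulate the share of observed failures captured as more pipes are inspected.
import numpy as np

def prediction_curve(scores, observed_failures):
    order = np.argsort(scores)[::-1]                   # highest predicted risk first
    captured = np.cumsum(np.asarray(observed_failures)[order])
    frac_pipes = np.arange(1, len(order) + 1) / len(order)
    frac_failures = captured / captured[-1]
    return frac_pipes, frac_failures

rng = np.random.default_rng(1)
risk = rng.random(1000)                                # model scores, e.g. expected failures per pipe
fails = rng.poisson(3 * risk)                          # synthetic observed failures
x, y = prediction_curve(risk, fails)
print("failures captured by inspecting the riskiest 10% of pipes:", round(y[99], 2))
```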
23. Public-Sector Employment in an Equilibrium Search and Matching Model
- Author
-
James W. Albrecht, Monica Robayo-Abril, and Susan B. Vroman
- Published
- 2017
- Full Text
- View/download PDF
24. Are Two Better Than One in Rheumatoid Arthritis? Development of Dual p38α MAPK/PDE-4 Inhibitors for Treatment of TNFα-Driven Diseases
- Author
-
S.M. Bauer, W. Albrecht, and S.A. Laufer
- Subjects
MAPK/ERK pathway, Kinase, p38 mitogen-activated protein kinases, Pharmacology, Rheumatoid arthritis, Medicine, Tumor necrosis factor alpha, Kinome, Receptor, Protein kinase A
- Abstract
Selective p38 mitogen-activated protein kinase (MAPK) inhibition has failed as a molecular principle in several phase II studies in rheumatoid arthritis, perhaps due to high redundancy with several bypass mechanisms (e.g., via c-Jun N-terminal kinases) in the kinome signaling involved in the release of tumor necrosis factor alpha (TNFα). We investigated compounds covering several MAPKs and selected a candidate based on its efficacy to inhibit TNFα release from human whole blood. Subsequently, we extended compound profiling to all ATP-like structures that bind to enzymes and receptors and thereby identified the dual p38α MAPK/phosphodiesterase 4 (PDE-4) inhibition as a mechanism to inhibit TNFα levels. Data from in vitro, ex vivo, and in vivo as well as phase I studies of the candidate CBS-3595 indicate that inhibition of both p38α MAPK and PDE-4 synergistically optimizes the antiinflammatory response.
- Published
- 2017
- Full Text
- View/download PDF
25. Transformational Approach for Alignment-free Image Matching Applications
- Author
-
Nandita Bhattacharjee, Komal Komal, Bala Srinivasan, and David W. Albrecht
- Subjects
Radon transform, Pixel, Computer science, Image matching, Computation, Computer vision and image processing, Pattern recognition, Robustness (computer science), Artificial intelligence, Process time, Precision and recall
- Abstract
One of the fundamental steps in computer vision and image processing applications is the alignment of images before matching. A number of alignment methods based on feature points, pixel information and transformation parameters have been proposed in the literature. All these methods either depend on accurate detection of feature points or are computationally expensive, which makes the entire matching process time-consuming. Fast matching of images performed without alignment can benefit a number of image matching applications. In this paper, a transformational approach is proposed that identifies whether two images are similar or not without performing an alignment. Matching without alignment reduces the computation time of the entire matching process. The effectiveness of the proposed approach has been demonstrated on the UMD logo dataset. The robustness of the proposed approach to additive noise shows that high accuracy can be achieved even when the noise is as high as 20%. Furthermore, the computation time is improved by decreasing the number of projections without affecting the matching accuracy. The experiments demonstrate that the proposed method is not only efficient but also highly accurate as compared to existing approaches.
- Published
- 2017
- Full Text
- View/download PDF
26. Sampling of a continuous flow in the plane based on two-dimensional variogram models
- Author
-
Thaung Lwin and David W. Albrecht
- Subjects
Statistics and Probability, Mathematical optimization, Plane (geometry), Applied Mathematics, Autocorrelation, Sampling (statistics), Euclidean distance, Discrete time and continuous time, Applied mathematics, Spatial variability, Statistics, Probability and Uncertainty, Variogram, Mathematics
- Abstract
Theory of sampling from one-dimensional continuous material flows is well established in developments following the original path-breaking work by P. Gy, based on an extensively employed tool named the variogram and its modelling. These developments could be regarded as a continuous analogue of the discrete time series approach in one-dimensional sampling due to W. G. Cochran, whose work was further unified with the variogram approach by G. H. Jowett. Extension of Cochran's discrete time series approach to plane sampling, from equally spaced grid locations in two dimensions, was carried out, independently, by M. H. Quenouille and R. Das. Gy's development is an off-shoot of a closely related geostatistical theory, pioneered by G. Matheron, for collecting and analysing data in three-dimensional space. The geostatistical approach has generally employed a variogram based on a Euclidean distance to provide a three-dimensional modelling of spatial variation structures. Most successful variogram modelling approaches are essentially based on geometrically isotropic one-dimensional variogram models, instead of variogram models based on generalised distance functions or distance functions in separate dimensions. Quenouille explicitly proposed a geometrically anisotropic two-dimensional autocorrelation function which could be readily reparameterised to obtain a two-dimensional elliptical variogram. Both the plane sampling approach and the geostatistical approach provide techniques for assessing the uncertainty of the sample average as an estimate of the overall population mean, by the respective techniques of two-dimensional discrete sampling and spatial variogram modelling. The present paper (i) outlines the connection between the precision formula based on the plane sampling approach and that based on the traditional geostatistical approach, via a unified framework in which the two methods can be compared on an equal footing, with or without equal spacing along the X-direction and/or the Y-direction, (ii) proposes an elliptical empirical variogram model, as a two-dimensional exponential-type variogram, obtained as a re-parameterised version of Quenouille's elliptical autocorrelation function, and (iii) provides a computationally robust algorithm for fitting a two-dimensional elliptical variogram model to an observed variogram in two dimensions by an approximate likelihood method, in addition to a demonstration of the developed methodology on a data set available in the published literature.
- Published
- 2014
- Full Text
- View/download PDF
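Editor's note: for the two-dimensional exponential-type variogram mentioned in entry 26, a standard way to write a geometrically anisotropic (elliptical) form is shown below, with sill σ² and directional ranges a_x, a_y (nugget omitted). The paper's re-parameterisation of Quenouille's autocorrelation function may differ in detail.

```latex
% Elliptical (geometrically anisotropic) exponential-type variogram in the plane;
% a standard parameterisation assumed for illustration, not necessarily the paper's.
\gamma(h_x, h_y) \;=\; \sigma^{2}\left[\,1 - \exp\!\left(-\sqrt{\left(\frac{h_x}{a_x}\right)^{2} + \left(\frac{h_y}{a_y}\right)^{2}}\,\right)\right]
```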
27. There’s No Cure for That: Melezina and Mimetic Transference in Fuenteovejuna
- Author
-
Jane W. Albrecht
- Subjects
Philosophy, Judaism, Honor, Identity (social science), Proposition, General Medicine, Theology, Relation (history of concept), Peasant, Persecution, Wonder
- Abstract
At the wedding celebration for Laurencia and Frondoso in Lope de Vega's Fuenteovejuna, the shepherd Mengo complains that the musicians have put little effort into their song.1 Frondoso teases Mengo, whose buttocks are still raw from a flogging by the sadistic Comendador's men, that surely he is more knowledgeable about whips than verses. As Mengo begins his retort excoriating the Comendador, another shepherd, Barrildo, interrupts and begs him not to mention the man: "No lo digas, por tu vida, / que este bárbaro homicida / a todos quita el honor" (1484-86). Mengo agrees and, after recalling the agony of the lashing, brings up a worse case: "MENGO. Pero que le hayan echado / una melezina a un hombre, / que, aunque no diré su nombre, / todos saben que es honrado, / llena de tinta y de chinas, / ¿cómo se puede sufrir? / BARRILDO. Haríalo por reír. / MENGO. No hay risas con melezinas, / que aunque es cosa saludable... / Yo me quiero morir luego" (1491-500). The lexical, syntactic, and thematic parallelism of their assertions welds nameless tyrant to anonymous victim, and hangs the villager's identity on honor. Mengo's speech is designed to advocate for the victim, who is worthy of his community's esteem, and efface the Comendador, who is responsible for abusing him and the other townspeople. The verses are nonetheless enigmatic: the victim goes unnamed, leading us to wonder whether he might have been Mengo himself. That the word melezina comes in three guises in this short exchange compounds the ambiguity: it is a cure ("aunque es cosa saludable"), a prank ("haríalo por reír"), and a form of rough treatment ("llena de tinta y de chinas").2 While the adulterated-enema punishment is represented in literature (additional examples are offered below), none of the dictionaries or reference works that I have seen acknowledge Mengo's claim that melezina was a form of persecution, further destabilizing an already slippery text.3 The origin and usage of the key word melezina and its sociolinguistic features elucidate Mengo's remarks. They point to the proposition that melezina resonated as an archaic and Jewish word which, in turn, suggests that Fuenteovejuna might enact mimetic transference between Jew and peasant. I stake out this position based on existing scholarship on Spain's "Jewish problem" in relation to the comedia. Taking her cues from Américo Castro's and others' provocative comments on the relationship between limpieza de sangre and honor, Melveena McKendrick postulates that a process of mimetic transference was underway that involved caballeros' honor, women's purity, and society's obsession with limpieza de sangre. She characterizes it as a type of psychopathology, according to which one disease masks another; playwrights who could not write about blood purity addressed sexual pureness and male honor instead. More recently, Andrew Herskovits explores the transpositions of honor and limpieza de sangre, and Jew and woman, and the possibility that a positive image of the Jew emerges when play texts draw parallels between a converso, on the one hand, and a female victim of the honor code on the other. Empathy explains them both. That is, what concern one elicits, the spectator or reader reassigns to the other (117-25). The notion of transference involving peasant and Jew may be operative in Fuenteovejuna. But in this instance, humor, not empathy, is behind the parallel.
To make the case that melezina extends the concept of mimetic transference to admit the figure of the peasant and the role of humor, I begin by examining the history and usage of the word. The spelling alternated between melezina and melecina. Rafael Lapesa suggests that in Lope de Vega's day there was not, generally speaking, a difference in pronunciation between them, although pockets may have persisted where older forms held on (190-91); Cervantes, in particular, did not distinguish between c, z, and ç before i and e, pronounced as a voiceless dental sibilant (Eisenberg 8). …
- Published
- 2014
- Full Text
- View/download PDF
28. Die neue EU-Spirituosenverordnung und ihr neues Schutzsystem für geografische Angaben im Spirituosensektor und somit auch für Spirituosen weinbaulichen Ursprungs (z.B. Cognac)
- Author
-
W. Albrecht
- Subjects
Environmental Engineering, Humanities, Industrial and Manufacturing Engineering
- Abstract
The new Spirits Regulation (EU) 2019/787 continues to regulate, in a product-specific way, the protection of registered geographical indications for spirit drinks originating in the EU and in third countries (so-called GI protection). It does, however, bring a number of changes, largely borrowed from the horizontal GI protection system for other agricultural products and foodstuffs anchored in Regulation (EU) No 1151/2012. Not only are the terms "product specification" and "single document" adopted, but also the two-stage registration procedure and the requirement that only private protection associations have the right to file applications for the registration of geographical indications. Geographical indications have been protected since 1 January 1996 under the WTO Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS) and, since 2015 through the Geneva Act, have also been internationally recognized as qualified indications of source, alongside protected and controlled designations of origin, in the Lisbon Agreement administered by WIPO.
- Published
- 2019
- Full Text
- View/download PDF
29. A Region-based Alignment-free Partial Fingerprint Matching
- Author
-
Komal Komal, Bala Srinivasan, David W. Albrecht, and Nandita Bhattacharjee
- Subjects
Matching (statistics), Similarity (geometry), Biometrics, Computer science, Fingerprint (computing), Fingerprint Verification Competition, Pattern recognition, Similarity measure, Fingerprint recognition, Metric (mathematics), Artificial intelligence
- Abstract
The fingerprint is one of the most researched biometrics and is applied in many applications such as border security, access security, and the security of electronic devices, to name a few. However, aligning and matching partial fingerprints convincingly is still a challenging problem. This paper proposes a region-based partial fingerprint matching approach that matches the fingerprints without aligning them. The approach focuses on extracting the matching regions in unaligned fingerprints. As extraction of regions happens without alignment, a new similarity metric is proposed that can compute a similarity value between unaligned fingerprints. The results demonstrate that the proposed approach can eliminate the need for aligning the fingerprints without affecting the accuracy. The popular Fingerprint Verification Competition (FVC) 2002 dataset is used for conducting experiments. Moreover, the results of fingerprint recognition are compared with the best alignment-based fingerprint matching techniques.
- Published
- 2016
- Full Text
- View/download PDF
30. Multiplexed NMR: An Automated CapNMR Dual-Sample Probe
- Author
-
Timothy L. Peck, Robert W. Albrecht, John J. Likos, Craig T. Milling, Claude Jones, Anthony Audrieth, Ke Ruan, Dean L. Olson, Duanxiang Xu, and James A. Norcross
- Subjects
Magnetic Resonance Spectroscopy, Spectrometer, Chemistry, Analytical chemistry, Shim (magnetism), Multiplexing, Automation, Data acquisition, Spectral resolution, Computer hardware, Digital signal processing, Microscale chemistry
- Abstract
A new generation of micro-scale, nuclear magnetic resonance (CapNMR™) probe technology employs two independent detection elements to accommodate two samples simultaneously. Each detection element in the Dual-Sample CapNMR Probe (DSP) delivers the same spectral resolution and S/N as in a CapNMR probe configured to accommodate one sample at a time. A high degree of electrical isolation allows the DSP to be used in a variety of data acquisition modes. Both samples are shimmed simultaneously to achieve high spectral resolution for simultaneous data acquisition, or alternatively, a flowcell-specific shim set is readily called via spectrometer subroutines to enable acquisition from one sample while the other is being loaded. An automation system accommodates loading of two samples via dual injection ports on an autosampler and two completely independent flowpaths leading to dedicated flowcells in the DSP probe.
- Published
- 2010
- Full Text
- View/download PDF
31. Differential effects of p38MAP kinase inhibitors on the expression of inflammation-associated genes in primary, interleukin-1β-stimulated human chondrocytes
- Author
-
Stefan Laufer, Rolf E. Brenner, W Albrecht, and Helga Joos
- Subjects
Pharmacology, Microarray analysis techniques, Inflammation, Biology, Matrix metalloproteinase, Molecular biology, Chondrocyte, In vitro, In vivo, Cell culture, Gene expression, Cancer research
- Abstract
Background and purpose: A main challenge in the therapy of osteoarthritis (OA) is the development of drugs that will modify the disease. Reliable test systems are necessary to enable an efficient screening of therapeutic substances. We therefore established a chondrocyte-based in vitro cell culture model in order to characterize different p38MAPK inhibitors. Experimental approach: Interleukin-1β (IL-1β)-stimulated human OA chondrocytes were treated with the p38MAPK inhibitors Birb 796, pamapimod, SB203580 and the new substance CBS-3868. Birb 796- and SB203580-treated cells were analysed in a genome-wide microarray analysis. The efficacy of all inhibitors was characterized by quantitative gene expression analysis and the quantification of PGE2 and NO release. Key results: Microarray analysis revealed inhibitor-specific differences in gene expression. Whereas SB203580 had a broad effect on chondrocytes, Birb 796 counteracted the IL-1β effect more specifically. All p38MAPK inhibitors significantly inhibited the IL-1β-induced gene expression of COX-2, mPGES1, iNOS, matrix metalloproteinase 13 (MMP13) and TNFRSF11B, as well as PGE2 release. Birb 796 and CBS-3868 showed a higher efficacy than SB203580 and pamapimod at inhibiting the expression of COX-2 and MMP13 genes, as well as PGE2 release. In the case of mPGES1 and TNFRSF11B gene expression, CBS-3868 exceeded the efficacy of Birb 796. Conclusions and implications: Our test system could differentially characterize inhibitors of the same primary pharmaceutical target. It reflects processes relevant in OA and is based on chondrocytes that are mainly responsible for cartilage degradation. It therefore represents a valuable tool for drug screening in between functional in vitro testing and in vivo models.
- Published
- 2010
- Full Text
- View/download PDF
32. Identifying error and maintenance intervention of pavement roughness time series with minimum message length inference
- Author
-
Jay Sanjayan, David W. Albrecht, and Matthew Byrne
- Subjects
Engineering, Inference, Regression analysis, Surface finish, Regression, Minimum message length, Mechanics of Materials, Segmentation, Algorithm, Simulation, Civil and Structural Engineering
- Abstract
Pavement roughness is a useful measure of pavement condition. One method of comparing alternative sections of pavements is the roughness progression rate (RPR). The objective of this paper is to describe difficulties in identifying RPR from real data and provide a new type of criterion to overcome these difficulties. Selecting appropriate regression functions for time-series roughness presents two major problems. Roughness time series can include roughness data that appear erroneous, acting independently of the observed time-series trend. Including likely error values will bias the calculated RPR. The problem of identifying likely error is made more difficult with the possibility of maintenance intervention, which may reduce the roughness level and/or progression rate. A minimum message length (MML) criterion to select RPR is introduced and is referred to herein as MML RPR. We perform simulated comparisons of common segmentation criteria and conclude that MML RPR is the preferred criterion.
- Published
- 2010
- Full Text
- View/download PDF
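Editor's note: the MML RPR criterion of entry 32 is not reproduced here; as a loose illustration of description-length model selection in the same spirit, the sketch below compares a single roughness trend against a trend with a maintenance reset using a crude BIC/MDL-style cost. The data, the breakpoint, and the cost terms are assumptions, not the paper's criterion.

```python
# Hedged sketch: choose between "one roughness trend" and "trend with a maintenance
# reset" via a crude two-part description-length cost (a simplified stand-in for MML).
import numpy as np

def gaussian_nll(residuals):
    sigma = residuals.std(ddof=0) + 1e-9
    return 0.5 * len(residuals) * np.log(2 * np.pi * sigma ** 2) + 0.5 * len(residuals)

def fit_cost(t, y, breakpoint=None):
    segments = [(t, y)] if breakpoint is None else [
        (t[t < breakpoint], y[t < breakpoint]), (t[t >= breakpoint], y[t >= breakpoint])]
    nll, n_params = 0.0, 0
    for ts, ys in segments:
        slope, intercept = np.polyfit(ts, ys, 1)           # per-segment roughness progression rate
        nll += gaussian_nll(ys - (slope * ts + intercept))
        n_params += 3                                       # slope, intercept, noise scale
    return nll + 0.5 * n_params * np.log(len(t))            # crude parameter cost

rng = np.random.default_rng(2)
t = np.arange(20, dtype=float)
y = 2.0 + 0.1 * t + rng.normal(0, 0.05, 20)
y[12:] -= 1.0                                               # simulated maintenance reset at t = 12
print("no break:", round(fit_cost(t, y), 1), "| break at 12:", round(fit_cost(t, y, 12), 1))
```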
33. Recognizing Patterns in Seasonal Variation of Pavement Roughness Using Minimum Message Length Inference
- Author
-
Jayantha Kodikara, Matthew Byrne, David W. Albrecht, and Jay Sanjayan
- Subjects
Inference, Surface finish, Seasonality, Spatial distribution, Computer Graphics and Computer-Aided Design, User input, Computer Science Applications, Minimum message length, Computational Theory and Mathematics, Statistics, Civil and Structural Engineering, Mathematics
- Abstract
Pavement roughness is a common measure of pavement condition regularly measured by road authorities. An approach to recognize patterns of seasonal variation in rural sealed granular pavement roughness by minimum message length (MML) inference is demonstrated in this article. MML solves two fundamental questions: First, is the seasonal variation a systematic pattern or merely the result of random scatter? Second, given evidence of seasonal variation, to what level of complexity should the seasonal trend be modeled? The MML technique developed does not require user input; rather, it will identify in a quantitative and consistent manner any patterns evident in the data. The patterns identified with MML can be used to remove seasonal variation effects. The analysis utilized 104,188 roughness values obtained from a particular region in Australia over 15 years. MML inference recognized patterns of seasonal variation and demonstrated that these are not merely due to random scatter. The optimum model selected by MML inference has four separate segments of variation. These segments correspond to changes in climatic conditions that support the inference.
- Published
- 2009
- Full Text
- View/download PDF
34. CME for neurosurgeons in the Netherlands: the 'quality' conferences
- Author
-
R. W. Koot, J. A. Grotenhuis, D. J. Zeilstra, J. J. A. Mooij, K. W. Albrecht, Michiel J. Staal, and Faculteit der Geneeskunde
- Subjects
Quality Control, Medical education, Quality Assurance, Health Care, Neurosurgery, Clinical Neurology, Congresses as Topic, Neurosurgical Procedures, Surgery, Meta-Analysis as Topic, Medicine, Education, Medical, Continuing, Quality (business), Neurology (clinical), Netherlands
- Abstract
In 1993 the Netherlands Society for Neurosurgery started a yearly event, a "Quality Conference", specifically devoted to continuous medical education (CME). These conferences differ from "normal" scientific meetings in the choice of specific topics, in the preparation with inquiries among all the Dutch neurosurgical centres, and in the way the results of these inquiries are discussed, preceded by lectures concerning the chosen topic by guest faculty and Dutch neurosurgeons. Each year's principal guest delivers the "Beks Lecture", named after the former professor in Neurosurgery in Groningen, Jan Beks. On several occasions, the foreign guests suggested presenting this format to a larger neurosurgical forum. Therefore, it was decided to describe the various aspects of this format for CME in the Netherlands in a paper for Acta Neurochirurgica. Examples of topics are given, a summary of two recent inquiries is presented and discussed, and the way of organizing such a conference, including finances and the obligatory character, is described.
- Published
- 2009
35. Identifying the Effects of Soil and Climate Types on Seasonal Variation of Pavement Roughness Using MML Inference
- Author
-
David W. Albrecht, Jayantha Kodikara, Jay Sanjayan, and Matthew Byrne
- Subjects
Tree (data structure), Mathematical model, Bayesian information criterion, Statistics, Pavement management, Inference, Segmentation, Akaike information criterion, Computer Science Applications, Civil and Structural Engineering, Mathematics, Minimum message length
- Abstract
Pavement roughness is a common measure of pavement distress and one regularly measured by road authorities. While permanent pavement deterioration that equates to increased roughness is commonly modeled, cyclical or seasonal variations are often not included. While these variations may be small, they may be important when alternate pavements are compared directly for performance. We propose that seasonal variation may be described by partitioning the data into groups that are modeled as a segmentation problem. We developed a minimum message length (MML) segmentation tree (MMLST) criterion for partitioning and segmentation of the data. We performed simulated comparisons of common segmentation criteria (MMLST, maximum likelihood, Akaike information criterion, and Bayesian information criterion) and conclude that MMLST is the preferred criterion. MMLST assists in answering the following questions. First, is the observed segmentation pattern due to seasonal variation or merely random scatter? Second, ...
- Published
- 2008
- Full Text
- View/download PDF
36. A NEW SENSITIZATION EFFECT OF PHOTOCONDUCTIVITY IN THE SYSTEM POLY-N-VINYL-CARBAZOLE/ACCEPTOR/DYE*
- Author
-
W. Albrecht, H. Meier, and U. Tschirwitz
- Subjects
Materials science, Absorption spectroscopy, Photoconductivity, Radical, General Medicine, Polymer, Photochemistry, Biochemistry, Acceptor, Vinyl carbazole, Molecule, Physical and Theoretical Chemistry
- Abstract
— In three-component systems, consisting of poly-N-vinyl-carbazole/acceptor/dye, besides the polymer band and the dye band, an additional peak in the photoconduction spectrum is formed. By comparison with the absorption spectrum, it is shown that the additional peak in the photoconduction spectrum cannot be related to the charge-transfer band between the polymer and the acceptor nor to the charge-transfer band between dye and acceptor. As an explanation, a cooperation of acceptor radicals — which are formed in the dark between poly-N-vinyl-carbazole and acceptor molecules — and dye is discussed.
- Published
- 2008
- Full Text
- View/download PDF
37. Optimization Framework for a Multiple Classifier System with Non-Registered Targets
- Author
-
Timothy W. Albrecht and Kenneth W. Bauer
- Subjects
Synthetic aperture radar, Engineering, Pattern recognition, Quadratic classifier, Sensor fusion, Automatic target recognition, Modeling and Simulation, Margin classifier, Artificial intelligence, Hidden Markov model, Engineering (miscellaneous), Pose
- Abstract
A fundamental problem facing the designers of automatic target recognition (ATR) systems is how to deal with out-of-library or non-registered targets. This research extends a mathematical programming framework that selects the optimal classifier ensemble and fusion method across multiple decision thresholds subject to classifier performance constraints. The extended formulation includes treatment of exemplars from target classes on which the ATR system is not trained (non-registered targets). Further, a multivariate Gaussian hidden Markov model (HMM) is developed and applied using real world synthetic aperture radar (SAR) data comprised of ten registered and five non-registered target classes. The framework is exercised in an experimental design across classifier fusion methods, prior probabilities of targets and non-targets, correlation between multiple sensor looks, and levels of target pose estimation error.
- Published
- 2008
- Full Text
- View/download PDF
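Editor's note: a toy sketch of the fusion-plus-rejection idea in entry 37: average two classifiers' class posteriors and reject inputs whose fused confidence falls below a decision threshold, treating them as non-registered targets. The class list, threshold, and posteriors are invented; this is not the paper's optimization framework or its HMM classifier.

```python
# Hedged sketch: fuse two classifiers by averaging posteriors; declare
# "non-registered" when the fused confidence is below a threshold.
import numpy as np

REGISTERED = ["tank", "truck", "artillery"]            # assumed in-library classes

def fuse_and_decide(posteriors_a, posteriors_b, threshold=0.6):
    fused = (np.asarray(posteriors_a) + np.asarray(posteriors_b)) / 2.0
    k = int(np.argmax(fused))
    if fused[k] < threshold:
        return "non-registered", fused
    return REGISTERED[k], fused

# Confident, agreeing looks -> declare the class; flat posteriors -> reject.
print(fuse_and_decide([0.8, 0.15, 0.05], [0.7, 0.2, 0.1]))
print(fuse_and_decide([0.4, 0.35, 0.25], [0.3, 0.4, 0.3]))
```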
38. Introduction to the special issue on statistical and probabilistic methods for user modeling
- Author
-
David W. Albrecht and Ingrid Zukerman
- Subjects
Computer science, User modeling, Divergence-from-randomness model, Probabilistic logic, Probabilistic database, Recommender system, Machine learning, Data science, Computer Science Applications, Education, Human-Computer Interaction, Web mining, Artificial intelligence, Probabilistic relevance model, Information explosion
- Abstract
Statistical and probabilistic models are concerned with the use of observed sample results to make statements about unknown, dependent parameters. In user modeling, these parameters represent aspects of a user's behaviour, such as his or her goals, preferences, and forthcoming actions or locations. Recent technological advances, in particular increased computational power, together with anytime, anyplace access to computers, and the information explosion associated with the Internet, provide new opportunities for information dissemination and information gathering. On one hand, people have access to large repositories of information in digital form. On the other hand, information providers can find out more about their users' requirements by logging people's activities. This mixture of vast electronic content and increased knowledge about people's actions provides an opportunity to harness statistical and probabilistic models to build user models that support the delivery of personalized content. This usage of statistical and probabilistic models has been manifested in UMUAI for the last ten years. Particularly noteworthy are the articles in the Special Issue on Machine Learning for User Modeling (1998); the survey articles by Zukerman and Albrecht and by Webb et al. in the 10-year anniversary issue, respectively on predictive statistical models for user modeling (Zukerman and Albrecht 2001), and on machine learning for user modeling (Webb et al. 2001); Burke's survey on recommender systems (Burke 2002); and Pierrakos et al.'s survey on Web usage mining (Pierrakos et al. 2003). These articles identified several challenges that user modeling presents to statistical and probabilistic modeling techniques. We classify these challenges into three categories: (1) limitations of current user modeling approaches, (2) dynamic nature of user modeling data, and (3) efficiency considerations.
- Published
- 2007
- Full Text
- View/download PDF
39. The Text and Performance of El caballero de Olmedo
- Author
-
Jane W. Albrecht
- Published
- 2006
- Full Text
- View/download PDF
40. Efficient Anomaly Detection by Isolation Using Nearest Neighbour Ensemble
- Author
-
Tharindu Bandaragoda, Fei Tony Liu, David W. Albrecht, Kai Ming Ting, and Jonathan R. Wells
- Subjects
Tree (data structure), Local outlier factor, Nearest neighbour, Anomaly detection, Pattern recognition, Artificial intelligence, Isolation (database systems), Time complexity, Mathematics
- Abstract
This paper presents iNNE (isolation using Nearest Neighbour Ensemble), an efficient nearest neighbour-based anomaly detection method by isolation. iNNE runs significantly faster than existing nearest neighbour-based methods such as Local Outlier Factor, especially in data sets having thousands of dimensions or millions of instances. This is because the proposed method has linear time complexity and constant space complexity. Compared with the existing tree-based isolation method iForest, the proposed isolation method overcomes three weaknesses of iForest that we have identified, i.e., its inability to detect local anomalies, anomalies with a low number of relevant attributes, and anomalies that are surrounded by normal instances.
- Published
- 2014
- Full Text
- View/download PDF
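The iNNE record above builds isolation scores from ensembles of nearest-neighbour hyperspheres. The sketch below is a simplified reading of that idea rather than the authors' exact algorithm: each ensemble member draws a small random subsample, gives every subsample point a hypersphere whose radius is the distance to its nearest neighbour in the subsample, and scores a query by how tight its smallest covering hypersphere is relative to that centre's neighbour; a query covered by no hypersphere gets the maximum score. Parameter values are illustrative.

```python
import math
import random

def _dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def build_ensemble(data, n_members=100, subsample=8, seed=0):
    """Each member: a random subsample in which every point carries a hypersphere
    whose radius is the distance to its nearest neighbour within the subsample."""
    rng = random.Random(seed)
    ensemble = []
    for _ in range(n_members):
        s = rng.sample(data, min(subsample, len(data)))
        nn = []  # (own radius, index of nearest neighbour) for each subsample point
        for i, p in enumerate(s):
            r, j = min((_dist(p, q), j) for j, q in enumerate(s) if j != i)
            nn.append((r, j))
        # store (centre, own radius, nearest neighbour's radius)
        ensemble.append([(s[i], nn[i][0], nn[nn[i][1]][0]) for i in range(len(s))])
    return ensemble

def score(x, ensemble):
    """Average isolation score; values near 1 suggest an anomaly."""
    total = 0.0
    for member in ensemble:
        covering = [(r, nn_r) for c, r, nn_r in member if _dist(x, c) <= r]
        if not covering:
            total += 1.0                 # outside every hypersphere: fully isolated
        else:
            r, nn_r = min(covering)      # tightest covering hypersphere
            total += (1.0 - nn_r / r) if r > 0 else 0.0
    return total / len(ensemble)

rng = random.Random(1)
data = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(500)]
model = build_ensemble(data)
print(score((0.1, 0.0), model), score((6.0, 6.0), model))  # normal vs far-away point
```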
41. Change over Time in the Thermal Conductivity of Ten-Year-Old PUR Rigid Foam Boards with Diffusion-Open Facings
- Author
-
W. Albrecht
- Subjects
Change over time ,Thermal conductivity ,Materials science ,Polymers and Plastics ,Organic Chemistry ,Industrial research ,Full thickness ,Diffusion (business) ,Composite material - Abstract
To estimate the long-term change in the thermal conductivity of PUR rigid foam, various accelerated ageing procedures (e.g. ageing at 70 °C) are used. The results of these procedures are expressed as safety increments intended to cover a service life in practice of between 25 and 50 years. To compare the estimated increments with the change over time in the thermal conductivity of real boards, a research project was started, sponsored by the German Polyurethane Producers’ Association (IVPU) and by funds from the German Ministry of Economics via the AIF (Association of Industrial Research Institutes). The tables and diagrams show the thermal conductivity of PUR boards at full thickness and the cell-gas composition over a period of ten years. The shape of the curves and the measured values are compared with typical diffusion coefficients and thermal conductivities of the cell gases. The development of the curves shows that the fixed increments in the European PUR standard EN 13165:2001-10 for pentane-blown PUR foams are correctly calculated. These fixed increments provide assurance for users of PUR rigid foam boards over very long periods (e.g. 25 years) and give confidence to builders and the building supervisory authorities. (A crude illustrative model of cell-gas ageing follows this record.)
- Published
- 2004
- Full Text
- View/download PDF
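The study above compares measured ageing curves with the behaviour expected from cell-gas diffusion. The sketch below is a deliberately crude toy model, not the study's method and not the EN 13165 procedure: it assumes air diffuses into the cells much faster than the pentane blowing agent diffuses out, tracks the cell-gas composition with first-order time constants, and estimates the foam conductivity as a mole-fraction-weighted gas conductivity plus a constant solid-plus-radiation term. Every numerical value is an assumption chosen only to show the qualitative shape of the curve.

```python
import math

# Toy model of cell-gas ageing in a closed-cell foam (all numbers illustrative,
# not measured values and not the EN 13165 increments).
LAMBDA_GAS = {"pentane": 0.014, "air": 0.026}     # W/(m*K), approximate pure-gas values
TAU_YEARS = {"air_in": 1.5, "pentane_out": 60.0}  # assumed diffusion time constants
LAMBDA_SOLID_RAD = 0.008                          # assumed solid + radiation contribution

def cell_gas_fractions(t_years):
    """Relative partial pressures, assuming first-order diffusion kinetics."""
    air = 1.0 - math.exp(-t_years / TAU_YEARS["air_in"])     # air diffuses in
    pentane = math.exp(-t_years / TAU_YEARS["pentane_out"])  # blowing agent leaves slowly
    return {"air": air, "pentane": pentane}

def foam_lambda(t_years):
    """Very rough estimate: mole-fraction-weighted gas conductivity + constant matrix term."""
    f = cell_gas_fractions(t_years)
    total = sum(f.values())
    gas = sum(f[g] * LAMBDA_GAS[g] for g in f) / total
    return LAMBDA_SOLID_RAD + gas

for t in (0.1, 1, 5, 10, 25):
    print(f"t = {t:>5} a  lambda ≈ {foam_lambda(t):.4f} W/(m*K)")
```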
42. A novel technique for preparation of aminated polyimide membranes with microfiltration characteristics
- Author
-
David L. Paul, F. Santoso, M. Schroeter, W. Albrecht, Reinhard Schomäcker, and Th. Weigel
- Subjects
Chemistry ,Microfiltration ,Ultrafiltration ,Synthetic membrane ,Filtration and Separation ,Biochemistry ,Reaction rate ,Membrane ,Chemical engineering ,Polymer chemistry ,Surface modification ,General Materials Science ,Amine gas treating ,Physical and Theoretical Chemistry ,Amination - Abstract
In the wet-chemical treatment of polyimide (PI) membranes with aminic modifiers, the modifier molecules become covalently bound to the membrane polymer. With modifiers of high aminic nitrogen content, the amination is combined with a degradation process that shifts the separation properties from ultrafiltration characteristics (untreated membrane) to microfiltration characteristics (treated membrane). Under optimal treatment conditions, the steepness of the separation curves of the aminated membranes is comparable with that of commercial microfilters. Characteristic membrane data such as water permeability, amine content, SEM morphology, wettability and the dependence of membrane thickness on the treatment conditions are presented. The data show that, with respect to the degradation reaction, the membrane properties are insensitive to the modifier concentration in the treatment bath but sensitive to the reaction rate. From the amine content per unit membrane surface area and per unit membrane weight, it was concluded that the functionalization/degradation process proceeds in two steps: in the first step functionalization dominates, whereas in the second step functionalization and degradation occur simultaneously, resulting in an equilibrium state. The expected reaction sequence of this degradative functionalization is proposed and discussed. The initial membrane morphology appears to be the key parameter for further work to optimize the membrane preparation process.
- Published
- 2003
- Full Text
- View/download PDF
43. Combined scintigraphic and pharmacokinetic investigation of enteric-coated mesalazine micropellets in healthy subjects
- Author
-
W. Albrecht, S. J. Tardif, Heather Wray, P. Bias, Ian R. Wilding, and C. Behrens
- Subjects
Hepatology ,Gastric emptying ,business.industry ,Gastroenterology ,Pharmacology ,Crossover study ,Bioavailability ,Excretion ,chemistry.chemical_compound ,Mesalazine ,chemistry ,Pharmacokinetics ,Medicine ,Ascending colon ,Pharmacology (medical) ,Dosing ,business - Abstract
Background: There is a growing clinical trend to increase the daily dose of mesalazine, which leads to significant compliance issues associated with the multiple dosing of current preparations. Aim: To examine the gastrointestinal performance and systemic exposure of a 1.5 g sachet (micropellet) mesalazine formulation compared with three enteric-coated tablets (500 mg each, Claversal). Methods: A randomized, two-way, cross-over pharmacoscintigraphic (scintigraphy plus pharmacokinetics) study and a two-way, cross-over, pharmacokinetic-only study were performed in 24 healthy volunteers (12 subjects per investigation). Results: The relative bioavailability of mesalazine was 92% for the micropellets compared with the Claversal tablets, and the cumulative urinary excretion was c. 26% for both preparations, suggesting comparable systemic exposure for the two types of preparation. In the majority of subjects, drug release from the micropellet formulation occurred predominantly in the terminal ileum and ascending colon. The Claversal tablets disintegrated at comparable intestinal sites, albeit at slightly later time points than the micropellets, principally owing to slower gastric emptying of the single-unit formulation. Conclusion: The 1.5 g micropellet formulation offers delivery properties comparable to those of the marketed tablets, but with greater convenience of dosing. (An illustrative relative-bioavailability calculation follows this record.)
- Published
- 2003
- Full Text
- View/download PDF
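The exposure comparison in the record above rests on the standard dose-normalised bioavailability ratio. The snippet below shows that calculation with invented AUC and urinary-recovery numbers chosen only so that the ratios reproduce the figures quoted in the abstract; it is not the study's data.

```python
# Hypothetical plasma AUC values (ug*h/mL) and urinary recovery, invented for
# illustration; both formulations deliver 1.5 g mesalazine in this example.
auc_test, dose_test = 18.4, 1500.0   # micropellet sachet (hypothetical AUC)
auc_ref,  dose_ref  = 20.0, 1500.0   # enteric-coated tablets (hypothetical AUC)

# Relative bioavailability: dose-normalised exposure of test vs reference.
f_rel = (auc_test / dose_test) / (auc_ref / dose_ref)
print(f"relative bioavailability ≈ {f_rel:.0%}")    # ~92% with these made-up numbers

# Cumulative urinary excretion as a fraction of the administered dose.
amount_in_urine_mg = 390.0                          # hypothetical
print(f"urinary recovery ≈ {amount_in_urine_mg / dose_test:.0%}")
```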
44. Magnesium therapy after aneurysmal subarachnoid haemorrhage: a dose-finding study for long-term treatment
- Author
-
K. W. Albrecht, J. W. Berkelbach van der Sprenkel, Walter van den Bergh, and G. J. E. Rinkel
- Subjects
Adult ,Male ,Time Factors ,Subarachnoid hemorrhage ,medicine.medical_treatment ,chemistry.chemical_element ,Brain Ischemia ,Excretion ,Humans ,Medicine ,Magnesium ,Prospective Studies ,Infusions, Intravenous ,Prospective cohort study ,Aged ,Chemotherapy ,Dose-Response Relationship, Drug ,business.industry ,Vascular disease ,Middle Aged ,Subarachnoid Hemorrhage ,medicine.disease ,Dose–response relationship ,chemistry ,Anesthesia ,Injections, Intravenous ,Female ,Surgery ,Neurology (clinical) ,Complication ,business - Abstract
BACKGROUND: Magnesium is a neuroprotective agent that might prevent or reverse delayed cerebral ischemia (DCI) after aneurysmal subarachnoid haemorrhage (SAH). Although the dosage for short-term magnesium therapy is well established, there is a lack of knowledge on the dosage for extended use of magnesium. Our aim was to find a dosage schedule of magnesium sulphate that maintains a serum magnesium level of 1.0-2.0 mmol/L for 14 days, covering the period of DCI. METHODS: We prospectively studied 14 patients admitted to our hospital within 48 hours after aneurysmal SAH. Magnesium sulphate was administered intravenously for 14 days using three different dosage schedules. Group A (n=3) received a bolus injection of 16 mmol magnesium sulphate followed by a continuous infusion of 16 mmol daily; group B (n=6) a continuous infusion of 30 mmol daily; and group C (n=5) a continuous infusion of 64 mmol daily. Serum magnesium was measured at least every two days, and all patients were under continuous observation during magnesium treatment. Renal magnesium excretion was measured only in group C. FINDINGS: In group A the mean serum magnesium level during treatment was 1.03±0.14 (range 0.82-1.34) mmol/L, in group B 1.10±0.15 (range 0.87-1.43) mmol/L, and in group C 1.38±0.18 (range 1.11-1.98) mmol/L. The renal magnesium excretion in group C equalled the administered dose within 48 hours after treatment had started. All patients in group A reported a flushing sensation during the bolus injection; no other side effects were noted. INTERPRETATION: With a continuous intravenous dose of 64 mmol per day, serum magnesium levels were maintained within the range of 1.0-2.0 mmol/L for 14 days.
- Published
- 2003
- Full Text
- View/download PDF
45. Ultrafiltration and microfiltration membranes in latex purification by diafiltration with suction
- Author
-
R Hilke, K Luetzow, Z. Pientka, W Albrecht, Galina Tishchenko, Jan Schauer, and Miroslav Bleha
- Subjects
Diafiltration ,chemistry.chemical_compound ,Membrane ,Chromatography ,Membrane permeability ,Chemistry ,Microfiltration ,Ultrafiltration ,Filtration and Separation ,Polysulfone ,Permeation ,Dialysis (biochemistry) ,Analytical Chemistry - Abstract
The operating conditions of diafiltration with suction were studied for the purification of poly(glycidyl methacrylate) latex from sodium tetraborate and emulsifier in a batch process using ultrafiltration blend polysulfone/poly(vinylpyrrolidone) membranes and microfiltration Synpor® membranes. The intensity of permeate suction was controlled by changing the pumping rate at fixed cross-sections of the inlet tubes in both the retentate and permeate lines. An optimum flow rate was determined for each membrane type to ensure the best purification efficiency. Operating at this flow rate not only prevented undesirable dilution of the latex with osmotic water but also ensured the highest membrane permeability to solutes without cake formation on the membrane surface. It was shown that a 92% degree of latex purification could be obtained by 8 h of suction diafiltration with a Synpor membrane whose pore entrance sizes are close to the nanoparticle dimensions. The possibility of complete purification of GMA nanoparticles from impurities using a hybrid membrane process, combining dialysis followed by suction diafiltration with microporous membranes and ultrafiltration with an appropriate membrane, is discussed. (A generic diavolume calculation follows this record.)
- Published
- 2003
- Full Text
- View/download PDF
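Purification by diafiltration is often summarised with the textbook constant-volume washout relation C/C0 = exp(-N*S), where N is the number of diavolumes exchanged and S is the sieving coefficient of the impurity. The sketch below uses that generic relation to estimate the diavolumes needed for a given purification degree; it is not a model of the suction-driven variant or of the specific membranes studied above.

```python
import math

def purification_degree(diavolumes, sieving=1.0):
    """Fraction of a permeating impurity removed after N diavolumes
    (constant-volume diafiltration, ideal mixing)."""
    return 1.0 - math.exp(-diavolumes * sieving)

def diavolumes_needed(target_degree, sieving=1.0):
    """Diavolumes required to reach a target purification degree."""
    return -math.log(1.0 - target_degree) / sieving

for target in (0.92, 0.95, 0.98):
    print(f"{target:.0%} purification -> {diavolumes_needed(target):.1f} diavolumes")
```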
46. Volatile organic compounds during inflammation and sepsis in rats: a potential breath test using ion-mobility spectrometry
- Author
-
Tobias, Fink, Alexander, Wolf, Felix, Maurer, Frederic W, Albrecht, Nathalie, Heim, Beate, Wolf, Anne C, Hauschild, Bertram, Bödeker, Jörg I, Baumbach, Thomas, Volk, Daniel I, Sessler, and Sascha, Kreuer
- Subjects
Inflammation ,Ions ,Male ,Rats, Sprague-Dawley ,Disease Models, Animal ,Volatile Organic Compounds ,Breath Tests ,Exhalation ,Sepsis ,Spectrum Analysis ,Animals ,Shock, Hemorrhagic ,Rats - Abstract
Multicapillary column ion-mobility spectrometry (MCC-IMS) may identify volatile components in exhaled gas. The authors therefore used MCC-IMS to evaluate exhaled gas in rat models of sepsis, inflammation, and hemorrhagic shock. Male Sprague-Dawley rats were anesthetized and ventilated via tracheostomy for 10 h or until death. Sepsis was induced by cecal ligation and incision in 10 rats; a sham operation was performed in 10 others. In 10 further rats, endotoxemia was induced by intravenous administration of 10 mg/kg lipopolysaccharide. In a final 10 rats, hemorrhagic shock was induced to a mean arterial pressure of 35 ± 5 mmHg. Exhaled gas was analyzed with MCC-IMS, and volatile compounds were identified using the BS-MCC/IMS-analytes database (Version 1209; BS Analytik, Dortmund, Germany). All sham animals survived the observation period, whereas mean survival time was 7.9 h in the septic animals, 9.1 h in the endotoxemic animals, and 2.5 h in hemorrhagic shock. Volatile compounds showed statistically significant differences between septic or endotoxemic rats and sham rats for 3-pentanone and acetone. Endotoxemic rats differed significantly from sham rats for 1-propanol, butanal, acetophenone, 1,2-butandiol, and 2-hexanone. Statistically significant differences were also observed between septic and endotoxemic rats for butanal, 3-pentanone, and 2-hexanone. 2-Hexanone differed from all other groups in the rats with shock. Exhaled organic compound profiles thus differed significantly among septic, endotoxemic, and sham rats. MCC-IMS of exhaled breath deserves additional study as a noninvasive approach for distinguishing sepsis from inflammation. (An illustrative per-compound group comparison follows this record.)
- Published
- 2014
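Statements such as "3-pentanone differed significantly between septic and sham rats" typically come from per-compound group comparisons. The snippet below illustrates one plausible test of that kind (a two-sided Mann-Whitney U test via scipy) on invented peak intensities; it is not the study's analysis pipeline or data.

```python
from scipy.stats import mannwhitneyu

# Invented MCC-IMS peak intensities (arbitrary units) for one compound,
# e.g. 3-pentanone, in two groups of rats; the values are illustrative only.
septic = [0.82, 0.91, 1.10, 0.77, 0.95, 1.02, 0.88, 0.99, 1.05, 0.90]
sham   = [0.41, 0.38, 0.52, 0.45, 0.36, 0.49, 0.40, 0.47, 0.43, 0.50]

# Two-sided Mann-Whitney U test: are the two intensity distributions different?
stat, p = mannwhitneyu(septic, sham, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.4g}")
```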
47. Structural Sensitivity for the Knowledge Engineering of Bayesian Networks
- Author
-
Chris Whittle, David W. Albrecht, and Ann E. Nicholson
- Subjects
business.industry ,Node (networking) ,Knowledge engineering ,Bayesian network ,Mutual information ,Machine learning ,computer.software_genre ,Variable (computer science) ,Ranking ,Metric (mathematics) ,Data mining ,Artificial intelligence ,Sensitivity (control systems) ,business ,computer ,Mathematics - Abstract
Whether a Bayesian Network (BN) is constructed through expert elicitation, from data, or a combination of both, evaluation of the resultant BN is a crucial part of the knowledge engineering process. One kind of evaluation is to analyze how sensitive the network is to changes in inputs, a form of sensitivity analysis commonly called “sensitivity to findings”. The properties of d-separation can be used to determine, from the BN structure alone, whether or not evidence (or findings) about one variable may influence belief in a target variable. Once the network is parameterised, it is also possible to measure this influence, for example with mutual information or variance. Given such a metric of change, it is common when evaluating a BN to rank nodes by either the maximum or the average such effect. However, this ranking tends to reflect the structural properties of the network: the longer the path from a node to the target node, the lower the influence, while the influence increases with the number of such paths. This raises the question: how useful is the ranking computed with the parameterised network, over and above what could be discerned from the structure alone? We propose a metric, Distance Weighted Influence, that ranks the influence of nodes based on the structure of the network alone. We show that this ranking provides useful feedback on the structure in the early stages of the knowledge engineering process, and that after parameterisation the interest from an evaluation perspective is how much the ranking has changed. We illustrate the practical use of this approach on real-world networks from the literature. (A simple structure-only ranking sketch follows this record.)
- Published
- 2014
- Full Text
- View/download PDF
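Distance Weighted Influence, as described above, ranks evidence nodes from the graph alone: influence should fall with the length of each connecting path to the target and rise with the number of such paths. The sketch below is one simple way to realise that intuition, summing decay^length over simple paths in the undirected skeleton of a toy network; the decay factor and the network are assumptions for illustration, it is not the paper's exact metric, and it ignores d-separation details such as blocked v-structures.

```python
# Toy BN structure (edges parent -> child); node names are illustrative only.
edges = [("Flu", "Fever"), ("Flu", "Cough"), ("Allergy", "Cough"),
         ("Cough", "ChestPain"), ("Fever", "ChestPain")]

# Undirected skeleton as an adjacency dict.
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

def simple_paths(src, dst, visited=None):
    """Yield all simple paths between src and dst in the skeleton."""
    visited = visited or [src]
    if src == dst:
        yield visited
        return
    for nxt in adj[src]:
        if nxt not in visited:
            yield from simple_paths(nxt, dst, visited + [nxt])

def structural_influence(node, target, decay=0.5):
    """Sum decay**(number of edges) over all simple paths to the target:
    shorter paths and more paths give a higher score."""
    return sum(decay ** (len(p) - 1) for p in simple_paths(node, target))

target = "ChestPain"
ranking = sorted((n for n in adj if n != target),
                 key=lambda n: structural_influence(n, target), reverse=True)
for n in ranking:
    print(f"{n:>10}: {structural_influence(n, target):.3f}")
```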
48. Selection and the Measured Black-White Wage Gap Among Young Women Revisited
- Author
-
James W. Albrecht, Aico van Vuuren, and Susan B. Vroman
- Published
- 2014
- Full Text
- View/download PDF
49. Formation of hollow fiber membranes from poly(ether imide) at wet phase inversion using binary mixtures of solvents for the preparation of the dope
- Author
-
W Albrecht
- Subjects
Filtration and Separation ,Ether ,Biochemistry ,Instantaneous phase ,chemistry.chemical_compound ,Membrane ,Chemical engineering ,chemistry ,Metastability ,Polymer chemistry ,Coagulation (water treatment) ,General Materials Science ,Fiber ,Physical and Theoretical Chemistry ,Phase inversion (chemistry) ,Spinning - Abstract
Macrovoidal morphologies are commonly generated when membranes are formed by immersion precipitation under instantaneous phase demixing. In this investigation, hollow fiber membranes were prepared from poly(ether imide) under instantaneous phase separation as a function of the dope solvent composition, which allows a defined and wide variation of the thermodynamic state of the spinning solution from a good to a metastable state. The results show that, despite instantaneous phase demixing, the membrane morphology can be varied from macrovoidal to sponge-like structures by a systematic variation of the dope solvent composition. Two locations were observed at which macrovoids were initiated: first, beneath the primary coagulation front (type 1), and second, far away from the primary coagulation front within the cross-section (type 2). The first type correlates with the cloud points and is preferentially induced by the thermodynamic state of the dope. Based on the kinetic data reported here, it is hypothesized that the second type, starting far away from the primary coagulation layer, is preferentially induced by the kinetics of the phase inversion.
- Published
- 2001
- Full Text
- View/download PDF
50. Purification of polymer nanoparticles by diafiltration with polysulfone/hydrophilic polymer blend membranes
- Author
-
W Albrecht, Miroslav Bleha, K Luetzow, Jan Schauer, and G Tishchenko
- Subjects
chemistry.chemical_classification ,Glycidyl methacrylate ,Chromatography ,Materials science ,Nanoparticle ,Emulsion polymerization ,Filtration and Separation ,Polymer ,Analytical Chemistry ,chemistry.chemical_compound ,Diafiltration ,Membrane ,chemistry ,Chemical engineering ,Polysulfone ,Polymer blend - Abstract
The separation ability of new UF membranes based on blends of polysulfone (PS) with hydrophilic polymers (HP) was investigated in the purification of polymer nanoparticle latices prepared by seeded emulsion polymerization. Low-molecular-weight components (emulsifier, salts, initiator) used in the synthesis of both glycidyl methacrylate (GMA) and sulfonated (SSS) nanoparticles, as well as monomer residues, were separated from nanoparticles of colloidal size (70–100 nm) by diafiltration using a three-compartment through-flow cell equipped with two membranes. In contrast to traditional pressure-driven diafiltration, this process is carried out under vacuum. The efficiencies of the new membranes and of some commercial membranes were compared in the purification of the nanoparticles. The influence of the nature of the HP and of the structural characteristics of the skin layer of the UF PS/HP membranes on their permeability to the components of the latex emulsions was estimated. The structural parameters of the membrane skin layer varied over the intervals 1–7 nm (average pore diameter), (0.6–49.2) × 10^10 (pore density) and 0.1–0.88% (relative surface porosity). For each membrane type, an optimum combination of these structural parameters was found that ensures the maximum permeate flux and degree of purification of the nanoparticle emulsions. It was shown that purification degrees of 95–98% for GMA and 85–89% for SSS nanoparticle emulsions can be achieved by 6 h of diafiltration with some of the UF blend PS/PVP membranes.
- Published
- 2001
- Full Text
- View/download PDF