71 results
Search Results
2. New Fast Edge & Fast Edge Paper
- Subjects
Fasteners ,Time ,Technology ,Adhesives ,Architecture and design industries ,Business ,Construction and materials industries - Abstract
TRIM-TEX: Choose Fast Edge for fast corner bead installation to get the job done on time and under budget. Fast Edge offers the strength you can feel confident in and the [...]
- Published
- 2019
3. New Products.
- Subjects
RUBBER coatings ,PAPER ,SINKS (Plumbing fixtures) ,FASTENERS - Abstract
The article offers information on the C-103 line of synthetic rubber coating materials from Thiokol Corp., sink-strainer liners from Harvey Paper Products Co., and Quicflex paper fasteners.
- Published
- 1935
4. Tensor Nuclear Norm LPV Subspace Identification.
- Author
-
Gunes, Bilal, Van Wingerden, Jan-Willem, and Verhaegen, Michel
- Subjects
EXPONENTIAL functions ,REGULARIZATION parameter ,CLOSED loop systems ,SYSTEM identification ,A priori - Abstract
Linear parameter varying (LPV) subspace identification methods suffer from an exponential growth in the number of parameters to estimate. This results in problems with ill-conditioning. In the literature, attempts have been made to address the ill-conditioning by using regularization. Its effectiveness hinges on suitable a priori knowledge. In this paper, we propose using a novel, alternative regularization. That is, we first show that the LPV sub-Markov parameters can be organized into several tensors that are multilinear low rank by construction. Namely, their matricization along any mode is a low-rank matrix. Then, we propose a novel convex method with tensor nuclear norm regularization, which exploits this low-rank property. Simulation results show that the novel method can have higher performance than the regularized LPV-PBSID$_{\text{opt}}$ technique in terms of variance accounted for. [ABSTRACT FROM AUTHOR] (An illustrative sketch of the tensor nuclear norm regularizer follows this record.)
- Published
- 2018
- Full Text
- View/download PDF
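The record above describes organizing the LPV sub-Markov parameters into multilinear low-rank tensors and penalizing the nuclear norms of their matricizations. As a minimal sketch (not the authors' identification code), the Python snippet below computes that regularizer, i.e. the sum of nuclear norms of the mode unfoldings of a 3-way tensor; the function names and the toy tensor are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): a convex surrogate for multilinear
# low rank, as described in the abstract above -- the sum of nuclear norms of
# the mode-n unfoldings (matricizations) of a 3-way tensor.
import numpy as np

def unfold(tensor, mode):
    """Matricize a tensor along the given mode."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def tensor_nuclear_norm(tensor):
    """Sum of nuclear norms (singular-value sums) of all mode unfoldings."""
    return sum(np.linalg.norm(unfold(tensor, m), ord='nuc')
               for m in range(tensor.ndim))

# Example: a rank-1 tensor has multilinear rank (1, 1, 1), so every unfolding
# has a single nonzero singular value and the regularizer stays small,
# while an unstructured random tensor is penalized much more heavily.
a, b, c = np.random.randn(4), np.random.randn(5), np.random.randn(6)
low_rank = np.einsum('i,j,k->ijk', a, b, c)
noise = np.random.randn(4, 5, 6)
print(tensor_nuclear_norm(low_rank), tensor_nuclear_norm(noise))
```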
5. Improving Sparsity and Scalability in Regularized Nonconvex Truncated-Loss Learning Problems.
- Author
-
Tao, Qing, Wu, Gaowei, and Chu, Dejun
- Subjects
SCALABILITY ,NONCONVEX programming ,PROBLEM solving - Abstract
The truncated regular $L_{1}$-loss support vector machine can eliminate the excessive number of support vectors (SVs); thus, it has significant advantages in robustness and scalability. However, in this paper, we discover that the associated state-of-the-art solvers, such as the difference-of-convex algorithm and the concave–convex procedure, not only have limited sparsity-promoting properties for general truncated losses, especially the $L_{2}$-loss, but also have poor scalability for large-scale problems. To circumvent these drawbacks, we present a general multistage scheme with an explicit interpretation of SVs as well as outliers. In particular, we solve the general nonconvex truncated-loss minimization through a sequence of associated convex subproblems, in which the outliers are removed in advance. The proposed algorithm can be regarded as a structural optimization attempt that carefully considers the sparsity imposed by the nonconvex truncated losses. We show that this general multistage algorithm offers sufficient sparsity, especially for the truncated $L_{2}$-loss. To further improve scalability, we propose a linear multistep algorithm that employs a single iteration of coordinate descent to monotonically decrease the objective function at each stage, and a kernel algorithm that uses the Karush–Kuhn–Tucker conditions to cheaply find most of the outliers for the next stage. Comparison experiments demonstrate that our methods are superior in both sparsity and scalability. [ABSTRACT FROM AUTHOR] (A minimal sketch of the multistage idea follows this record.)
- Published
- 2018
- Full Text
- View/download PDF
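As a rough illustration of the multistage idea summarized above (solve a sequence of convex subproblems with the current outliers removed), here is a hedged Python sketch. It uses scikit-learn's LinearSVC as a stand-in for the convex squared-hinge subproblem solver; the truncation level `s`, the stage count, and the helper name are assumptions, not the paper's algorithm.

```python
# Minimal sketch (assumptions, not the paper's solver): a multistage scheme for
# a truncated squared-hinge loss.  At each stage, points whose loss exceeds the
# truncation level `s` under the current model are treated as outliers and
# removed, and a plain convex (squared-hinge) SVM is refit on the rest.
import numpy as np
from sklearn.svm import LinearSVC

def multistage_truncated_svm(X, y, s=4.0, stages=5, C=1.0):
    keep = np.ones(len(y), dtype=bool)
    clf = None
    for _ in range(stages):
        clf = LinearSVC(C=C, loss='squared_hinge').fit(X[keep], y[keep])
        margins = y * clf.decision_function(X)        # y in {-1, +1}
        losses = np.maximum(0.0, 1.0 - margins) ** 2  # squared hinge on all points
        keep = losses <= s                            # drop the current outliers
    return clf, keep

# Toy usage with a few label-flipped outliers.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(+1, 1, (100, 2))])
y = np.array([-1] * 100 + [+1] * 100)
y[:5] = +1                      # inject a few label-flip outliers
clf, inliers = multistage_truncated_svm(X, y)
print(inliers.sum(), "points kept after the final stage")
```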
6. Foam-Based Fasteners: No. 7,608,070; Fung-jou Chen, Julie Bednarz, Nadezhda Efremova, Sheng-Hsin Hu, Jeffrey Lindsay and Lisha Yu, assignors to Kimberly-Clark Worldwide, Inc., Neenah, Wis
- Subjects
Paper industry -- Intellectual property ,Hardware industry -- Intellectual property ,Fasteners ,Business ,Fashion, accessories and textiles industries - Abstract
Filed 9/30/2004. Issued 10/17/2009. A patent has been issued for foam-based fasteners. The article has a mechanical fastener. It is comprised of a body portion, which may include a fibrous [...]
- Published
- 2009
7. Disposable Absorbent Garment With Elastic Inner Layer Having Multiple Fasteners: No. 7,329,794; Paul Van Gompel and Georgia Zehner, Larsen, WI, assignors to Kimberly-Clark Worldwide, Neenah, WI
- Subjects
Paper industry ,Hardware industry ,Fasteners ,Clothing and dress ,Business ,Fashion, accessories and textiles industries - Abstract
Disposable Absorbent Garment With Elastic Inner Layer Having Multiple Fasteners: No. 7,329,794; Paul Van Gompel and Georgia Zehner, Larsen, WI, assignors to Kimberly-Clark Worldwide, Neenah, WI. Filed 12/31/03. A disposable [...]
- Published
- 2008
8. Optimizing Top-$k$ Multiclass SVM via Semismooth Newton Algorithm.
- Author
-
Chu, Dejun, Lu, Rui, Li, Jin, Yu, Xintong, Zhang, Changshui, and Tao, Qing
- Subjects
SEMISMOOTH Newton methods ,SUPPORT vector machines ,ARTIFICIAL neural networks - Abstract
Top-$k$ performance has recently received increasing attention in large data categories. Advances, like the top-$k$ multiclass support vector machine (SVM), have consistently improved the top-$k$ accuracy. However, the key ingredient in the state-of-the-art optimization scheme based upon stochastic dual coordinate ascent relies on the sorting method, which yields $O(d\log d)$ complexity. In this paper, we leverage the semismoothness of the problem and propose an optimized top-$k$ multiclass SVM algorithm, which employs a semismooth Newton algorithm for the key building block to improve the training speed. Our method enjoys a local superlinear convergence rate in theory. In practice, experimental results confirm its validity. Our algorithm is four times faster than the existing method on large synthetic problems; moreover, on real-world data sets it also shows significant improvement in training time. [ABSTRACT FROM AUTHOR] (A minimal sketch of a top-$k$ hinge loss follows this record.)
- Published
- 2018
- Full Text
- View/download PDF
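The abstract above centers on a top-$k$ multiclass SVM loss whose proximal subproblem the authors solve with a semismooth Newton method. The sketch below only illustrates one common top-$k$ hinge variant (the average of the $k$ largest margin violations); the exact loss form and all names here are assumptions, and the Newton solver itself is not shown.

```python
# Minimal sketch (an assumed loss form, not the paper's code): a simple
# "top-k hinge" variant -- the average of the k largest margin violations
# (1 + s_j - s_y for competitors j != y), clipped at zero.
import numpy as np

def topk_hinge_loss(scores, y, k=3):
    """scores: (n_classes,) decision values; y: index of the true class."""
    margins = 1.0 + scores - scores[y]
    margins = np.delete(margins, y)      # keep competitors only
    topk = np.sort(margins)[-k:]         # the k largest violations
    return max(0.0, topk.mean())

scores = np.array([1.2, 3.1, 0.4, 2.9, 2.8])
print(topk_hinge_loss(scores, y=1, k=1))  # true class ranked first, margin < 1 -> small loss
print(topk_hinge_loss(scores, y=2, k=3))  # true class scored low -> large loss
```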
9. Exclusive Sparsity Norm Minimization With Random Groups via Cone Projection.
- Author
-
Huang, Yijun and Liu, Ji
- Subjects
GENE expression ,MACHINE learning ,SIGNAL processing - Abstract
Many practical applications such as gene expression analysis, multitask learning, image recognition, signal processing, and medical data analysis pursue a sparse solution for feature selection and particularly favor nonzeros that are evenly distributed across different groups. The exclusive sparsity norm has been widely used to serve this purpose. However, it still lacks systematic studies for exclusive sparsity norm optimization. This paper offers two main contributions from the optimization perspective: 1) we provide several efficient algorithms to solve exclusive sparsity norm minimization with either a smooth loss or the hinge loss (a nonsmooth loss), all of which achieve the optimal convergence rate $O(1/k^{2})$ ($k$ is the iteration number); to the best of our knowledge, this is the first time such a convergence rate has been guaranteed for general exclusive sparsity norm minimization; and 2) when group information is unavailable to define the exclusive sparsity norm, we propose a random grouping scheme to construct groups and prove that if the number of groups is appropriately chosen, the nonzeros (true features) will be grouped in the ideal way with high probability. Empirical studies validate the efficiency of the proposed algorithms and the effectiveness of the random grouping scheme on the proposed exclusive support vector machine formulation. [ABSTRACT FROM AUTHOR] (A minimal sketch of the exclusive sparsity norm follows this record.)
- Published
- 2018
- Full Text
- View/download PDF
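To make the exclusive sparsity norm and the random grouping idea from the abstract concrete, here is a minimal Python sketch under assumed notation (the regularizer taken as the sum over groups of squared group-wise $\ell_1$ norms). It is illustrative only and not the authors' solvers.

```python
# Minimal sketch (assumed notation): the exclusive sparsity norm
# sum_g (||w_g||_1)^2, plus the random grouping the abstract proposes when no
# natural group structure is available.
import numpy as np

def exclusive_sparsity_norm(w, groups):
    """groups: array of group ids, one per coordinate of w."""
    return sum(np.abs(w[groups == g]).sum() ** 2 for g in np.unique(groups))

groups = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 3])
w_spread = np.zeros(12); w_spread[[0, 3, 6, 9]] = 1.0   # one nonzero per group
w_packed = np.zeros(12); w_packed[[0, 1, 2, 3]] = 1.0   # nonzeros packed into few groups

# Spreading nonzeros across groups gives the smaller penalty (4 vs 10),
# which is exactly the structure the exclusive norm favors.
print(exclusive_sparsity_norm(w_spread, groups),
      exclusive_sparsity_norm(w_packed, groups))

# When no group information exists, groups can be drawn at random, e.g.:
rng = np.random.default_rng(0)
random_groups = rng.permutation(groups)
```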
10. Incremental Design of Simplex Basis Function Model for Dynamic System Identification.
- Author
-
Yu, Juntang, Wang, Shuning, and Li, Li
- Subjects
SUPPORT vector machines ,ARTIFICIAL intelligence ,ARTIFICIAL neural networks - Abstract
In this paper, we propose a novel adaptive piecewise linear model for dynamic system identification. It has four unique features. First, the model designs a new kind of basis function for function approximation. It maintains the uniform shape for each basis function, so as to achieve a satisfactory tradeoff between generalization ability and model complexity. Second, the model takes the structure of basis functions as decision variables to optimize the formulated identification problems instead of taking expansion coefficients as decision variables as proposed by many existing approaches. Third, we establish an incremental design strategy to solve the system identification problems. In each step of the identification, the selection of optimal basis function is a Lipschitz continuous optimization problem that is likely to be easily handled with some mature toolboxes. This incremental design strategy greatly reduces the estimation cost. Fourth, we introduce a smoothing mechanism to avoid overfitting, when the output of dynamic systems is disturbed by noise. Tests on several benchmark dynamic systems demonstrate the potential of the proposed model. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
11. Online Feature Transformation Learning for Cross-Domain Object Category Recognition.
- Author
-
Zhang, Xuesong, Zhuang, Yan, Wang, Wei, and Pedrycz, Witold
- Subjects
FEATURE extraction ,PATTERN recognition systems ,DISTANCE education - Abstract
In this paper, we introduce a new research problem termed online feature transformation learning in the context of multiclass object category recognition. The learning of a feature transformation is viewed as learning a global similarity metric function in an online manner. We first consider the problem of online learning of a feature transformation matrix expressed in the original feature space and propose an online passive-aggressive feature transformation algorithm. Then these original features are mapped to kernel space and an online single kernel feature transformation (OSKFT) algorithm is developed to learn a nonlinear feature transformation. Based on the OSKFT and the existing Hedge algorithm, a novel online multiple kernel feature transformation algorithm is also proposed, which can further improve the performance of online feature transformation learning in large-scale applications. The classifier is trained with the k-nearest-neighbor algorithm together with the learned similarity metric function. Finally, we experimentally examine the effect of setting different parameter values in the proposed algorithms and evaluate the model performance on several multiclass object recognition data sets. The experimental results demonstrate the validity and good performance of our methods on cross-domain and multiclass object recognition applications. [ABSTRACT FROM AUTHOR] (A generic passive-aggressive similarity update is sketched after this record.)
- Published
- 2018
- Full Text
- View/download PDF
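The record above learns a feature transformation online with passive-aggressive updates. The sketch below shows a generic passive-aggressive (PA-I style) update for a bilinear similarity $s(x, z) = x^{\top}Az$, which is only an assumed stand-in for the paper's algorithms; the margin of 1, the toy classes, and all names are illustrative.

```python
# Minimal sketch (a generic passive-aggressive update for a bilinear similarity,
# not the paper's exact algorithms): same-class pairs (y = +1) are pushed above
# margin 1, different-class pairs (y = -1) are pushed below it.
import numpy as np

def pa_similarity_update(A, x, z, y, C=1.0):
    loss = max(0.0, 1.0 - y * (x @ A @ z))
    if loss > 0.0:
        tau = min(C, loss / (np.dot(x, x) * np.dot(z, z)))  # PA-I step size
        A = A + tau * y * np.outer(x, z)
    return A

rng = np.random.default_rng(0)
d = 5
A = np.eye(d)
means = [rng.normal(size=d), rng.normal(size=d)]     # two toy classes
for _ in range(1000):                                # stream of labeled pairs
    ci, cj = rng.integers(0, 2), rng.integers(0, 2)
    x = means[ci] + 0.3 * rng.normal(size=d)
    z = means[cj] + 0.3 * rng.normal(size=d)
    A = pa_similarity_update(A, x, z, +1 if ci == cj else -1)

same = means[0] @ A @ (means[0] + 0.3 * rng.normal(size=d))
diff = means[0] @ A @ (means[1] + 0.3 * rng.normal(size=d))
print("same-class similarity:", round(same, 2), " cross-class:", round(diff, 2))
```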
12. Solution Path for Pin-SVM Classifiers With Positive and Negative $\tau $ Values.
- Author
-
Huang, Xiaolin, Shi, Lei, and Suykens, Johan A. K.
- Subjects
SUPPORT vector machines ,ERROR - Abstract
Applying the pinball loss in a support vector machine (SVM) classifier results in pin-SVM. The pinball loss is characterized by a parameter $\tau$. Its value is related to the quantile level, and different $\tau$ values are suitable for different problems. In this paper, we establish an algorithm to find the entire solution path for pin-SVM with different $\tau$ values. This algorithm is based on the fact that the optimal solution to pin-SVM is continuous and piecewise linear with respect to $\tau$. We also show that the nonnegativity constraint on $\tau$ is not necessary, i.e., $\tau$ can be extended to negative values. First, in some applications, a negative $\tau$ leads to better accuracy. Second, $\tau = -1$ corresponds to a simple solution that links SVM and the classical kernel rule. The solution for $\tau = -1$ can be obtained directly and then be used as a starting point of the solution path. The proposed method efficiently traverses $\tau$ values through the solution path and then achieves good performance with a suitable $\tau$. In particular, $\tau = 0$ corresponds to C-SVM, meaning that the traversal algorithm can output a result at least as good as C-SVM with respect to validation error. [ABSTRACT FROM AUTHOR] (A minimal sketch of the pinball loss follows this record.)
- Published
- 2017
- Full Text
- View/download PDF
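As a small aid to the abstract above, the following sketch evaluates the standard pinball loss for several $\tau$ values, including the extension to negative $\tau$ mentioned in the record. It is the textbook definition, not the authors' solution-path algorithm.

```python
# Minimal sketch (standard definition, not the authors' path algorithm): the
# pinball loss used by pin-SVM.  tau = 0 recovers the usual hinge loss, and the
# abstract notes tau may be extended down to tau = -1, where the loss becomes
# linear in the margin.
import numpy as np

def pinball_loss(margin, tau):
    """margin = y * f(x); slope -1 left of margin 1, slope tau to its right."""
    u = 1.0 - margin
    return np.where(u >= 0, u, -tau * u)

margins = np.array([-0.5, 0.5, 1.0, 2.0])
for tau in (-1.0, 0.0, 0.5):
    print(tau, pinball_loss(margins, tau))
```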
13. one and done.
- Author
-
HORACZEK, STAN and VERGER, ROB
- Subjects
COAL mine waste ,FASTENERS ,CHROME steel ,MILITARY air pilots ,CONCRETE masonry - Abstract
Otto Rohwedder reinvented bread when he created the first machine to slice it. Your name for the venerable CMU probably depends upon where you live, but cinder block, breeze block, and hollow block all refer to an 8-by-8-by-16-inch brick with two or three internal voids. The Piper J-3 Cub was a cheap, simple, and quick machine with two seats placed one behind the other inside a tubular steel frame wrapped in cotton fabric. [Extracted from the article]
- Published
- 2020
14. Vibrational characteristics of fastening of a spherical shell with a coupled conical-conical shells strengthened with nanocomposite sandwiches contained agglomerated CNT face layers and GNP core under spring boundary conditions.
- Author
-
Sobhani, Emad
- Subjects
NANOCOMPOSITE materials, SHEAR (Mechanics), DIFFERENTIAL equations, FASTENERS, BIVALVE shells, NUCLEUS accumbens, LAMINATED composite beams - Abstract
• Obtaining the natural frequency of the Fastened Spherical-Conical-Conical Shells (FSCCSs).
• Analyzing 12 different geometry types related to the FSCCSs.
• Employing four new nanocomposite sandwiches composed of agglomerated CNT face layers and a GNP core to reinforce the FSCCSs.
• Determining the primary relationships of the FSCCS shell segments by using the FSDT and Donnell's shell theory and discretizing the governing differential equations of the FSCCSs by using the meshless GDQ method.
• Applying the artificial springs technique to provide elastic boundary conditions related to the FSCCS.
This paper computes the Natural Frequency Parameters (NFPs) of Fastened Spherical-Conical-Conical Shell (FSCCS) structures composed of Nanocomposite Sandwich (NS) materials under spring Boundary Conditions (BCs). The NS material implemented here is made of three layers: a Top Face Layer (TFL), a Bottom Face Layer (BFL), and a Core Layer (CL). The TFL and BFL are composed of Carbon Nano Tube (CNT) nano-fillers with agglomeration characteristics, while the CL uses Graphene Nano Platelet (GNP) nano-materials. The mechanical properties of the Agglomerated Carbon Nano-Tube Nanocomposite (ACNTN) material composing the TFL and BFL (ACNTN-TFL and ACNTN-BFL) are obtained with the Eshelby-Mori-Tanaka Approach (EMTA), while the Halpin-Tsai Approach (HTA) is used to extract these values for the GNP-CL. Four different sandwich models are implemented by considering the compatibility between layers. The First Order Shear Deformation Theory (FOSDT) and Donnell's Shell Theory (DST) are exploited to obtain the primary relationships of the FSCCS segments, and Hamilton's principle is used to derive the Governing Differential Equations (GDEs) of the FSCCS's components. The well-systematized computational meshless method, the Generalized Differential Quadrature (GDQ) program, is then applied to discretize the GDEs of the structure's segments, and an eigenvalue calculation yields the NFPs of the FSCCS structure. To validate the presented framework, the FSCCS results computed here are compared with responses from FE-based commercial software. Finally, multiple examples are constructed and computed to show the influence of different material, geometric, and BC configurations on the NFPs of the NS-FSCCS structures, including 12 different geometry types. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
15. A Divide-and-Conquer Method for Scalable Robust Multitask Learning.
- Author
-
Pan, Yan, Xia, Rongkai, Yin, Jian, and Liu, Ning
- Subjects
ROBUST control ,COMPUTER multitasking ,LOW-rank matrices ,SINGULAR value decomposition ,LEAST squares - Abstract
Multitask learning (MTL) aims at improving the generalization performance of multiple tasks by exploiting the shared factors among them. An important line of research in MTL is robust MTL (RMTL) methods, which use trace-norm regularization to capture task relatedness via a low-rank structure. The existing algorithms for the RMTL optimization problems rely on the accelerated proximal gradient (APG) scheme, which needs repeated full singular value decomposition (SVD) operations. However, the time complexity of a full SVD is $O(\min(md^{2}, m^{2}d))$ for an RMTL problem with $m$ tasks and $d$ features, which becomes unaffordable in real-world MTL applications that often have a large number of tasks and high-dimensional features. In this paper, we propose a scalable solution for large-scale RMTL, with either the least squares loss or the squared hinge loss, by a divide-and-conquer method. The proposed method divides the original RMTL problem into several size-reduced subproblems, solves these cheaper subproblems in parallel by any base algorithm (e.g., APG) for RMTL, and then combines the results to obtain the final solution. Our theoretical analysis indicates that, with high probability, the recovery errors of the proposed divide-and-conquer algorithm are bounded by those of the base algorithm. Furthermore, in order to solve the subproblems with the least squares loss or the squared hinge loss, we propose two efficient base algorithms based on the linearized alternating direction method, respectively. Experimental results demonstrate that, with little loss of accuracy, our method is substantially faster than the state-of-the-art APG algorithms for RMTL. [ABSTRACT FROM PUBLISHER] (A minimal sketch of the divide-and-conquer scheme follows this record.)
- Published
- 2015
- Full Text
- View/download PDF
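The divide-and-conquer scheme summarized above is easy to sketch: split the tasks into subsets, solve each reduced problem with a base algorithm, and stack the results. In the hedged Python sketch below, an independent per-task ridge regression stands in for the real APG-based RMTL base solver, so the code shows only the splitting and combining skeleton.

```python
# Minimal sketch (scheme only): divide the m tasks into random subsets, solve
# each size-reduced subproblem with a base algorithm, and stack the per-subset
# solutions.  A per-task ridge solver stands in for the real APG-based RMTL
# base algorithm here (an assumption).
import numpy as np

def base_solver(Xs, ys, lam=1.0):
    """Placeholder base algorithm: independent ridge regression per task."""
    W = []
    for X, y in zip(Xs, ys):
        d = X.shape[1]
        W.append(np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y))
    return np.column_stack(W)                       # (d, n_tasks_in_subset)

def divide_and_conquer_mtl(Xs, ys, n_splits=2, seed=0):
    m = len(Xs)
    order = np.random.default_rng(seed).permutation(m)
    W = np.zeros((Xs[0].shape[1], m))
    for chunk in np.array_split(order, n_splits):   # subproblems (parallelizable)
        W[:, chunk] = base_solver([Xs[i] for i in chunk], [ys[i] for i in chunk])
    return W

rng = np.random.default_rng(1)
Xs = [rng.normal(size=(50, 10)) for _ in range(6)]
ys = [X @ rng.normal(size=10) + 0.1 * rng.normal(size=50) for X in Xs]
print(divide_and_conquer_mtl(Xs, ys).shape)          # (10, 6): one column per task
```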
16. Have you ever wondered...?
- Subjects
FASTENERS ,STAINLESS steel ,EXPORTERS ,FOOD processor cooking ,COOKING ,KITCHEN utensils - Abstract
There are countless colourful things out there to pick up, toy with, and perhaps pause to admire. During the Nazi occupation, Norwegians wore paper clips on their lapels as a sign of national unity and opposition to the Germans, an act that could result in arrest. The first member of the Alessi family to craft metal was Giovanni Alessi, who founded his company by the shores of Lake Orta. Design became an increasingly important element, while stainless steel took over as the primary metal being used; Alessi now exports 65 per cent of all it manufactures. The first food processors were cumbersome and designed to be used in commercial kitchens, but miniaturisation soon saw them entering the domestic sphere. The Kenwood Chef appeared in Great Britain in 1950. Soon after, Sunbeam's sleek, chrome-plated Mixmaster drew heavily on automotive design influences.
- Published
- 2004
17. New Products.
- Subjects
COMMERCIAL products ,FASTENERS - Abstract
The article presents information on several commercial products including silent radio by Dictograph Products, jet-black rustproof finish by Glidden and Parker Rust Proof, and Fastnrite paper fastener by Progressive Mechanical.
- Published
- 1936
18. Product Index.
- Subjects
FLEXIBLE manufacturing systems ,FASTENERS ,ELECTRIC power distribution equipment ,CLEANING equipment ,AUTOMATED storage retrieval systems ,BELT conveyors ,ADHESIVE tape ,MATERIALS handling equipment - Published
- 2020
19. Online Learning Algorithms Can Converge Comparably Fast as Batch Learning.
- Author
-
Lin, Junhong and Zhou, Ding-Xuan
- Subjects
MACHINE learning ,MATHEMATICAL programming ,ARTIFICIAL neural networks - Abstract
Online learning algorithms in a reproducing kernel Hilbert space associated with convex loss functions are studied. We show that in terms of the expected excess generalization error, they can converge comparably fast as corresponding kernel-based batch learning algorithms. Under mild conditions on loss functions and approximation errors, fast learning rates and finite sample upper bounds are established using polynomially decreasing step-size sequences. For some commonly used loss functions for classification, such as the logistic loss and the $p$-norm hinge loss with $p \in [1,2]$, the learning rates are the same as those for Tikhonov regularization and can be of order $O(T^{-1/2}\log T)$, which is nearly optimal up to a logarithmic factor. Our novelty lies in a sharp estimate for the expected values of norms of the learning sequence (or an inductive argument to uniformly bound the expected risks of the learning sequence in expectation) and a refined error decomposition for online learning algorithms. [ABSTRACT FROM AUTHOR] (A minimal sketch of such an online kernel update follows this record.)
- Published
- 2018
- Full Text
- View/download PDF
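To illustrate the class of algorithms analyzed above, here is a minimal sketch of online learning in an RKHS with a Gaussian kernel, logistic loss, and a polynomially decreasing step size $\eta_t = \eta_0 t^{-\theta}$. The kernel choice, step-size constants, and the omission of a regularization term are assumptions; this is not the paper's exact scheme.

```python
# Minimal sketch (assumed setup, not the paper's exact scheme): online gradient
# descent in an RKHS with a Gaussian kernel, logistic loss, and a polynomially
# decreasing step size eta_t = eta0 * t^(-theta).
import numpy as np

def gaussian_kernel(x, z, sigma=1.0):
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

def online_kernel_logistic(X, y, theta=0.5, eta0=1.0, sigma=1.0):
    centers, coefs = [], []
    for t, (x, label) in enumerate(zip(X, y), start=1):
        f_x = sum(c * gaussian_kernel(x, z, sigma) for c, z in zip(coefs, centers))
        grad = -label / (1.0 + np.exp(label * f_x))  # d/df of log(1 + exp(-y f))
        eta = eta0 * t ** (-theta)                   # polynomially decreasing step
        centers.append(x)
        coefs.append(-eta * grad)                    # f <- f - eta * grad * K(x_t, .)
        # (A Tikhonov-regularized variant would also shrink earlier coefficients.)
    return centers, coefs

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=200))
centers, coefs = online_kernel_logistic(X, y)
print(len(centers), "kernel expansion terms after one pass")
```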
20. A Novel Twin Support-Vector Machine With Pinball Loss.
- Author
-
Xu, Yitian, Yang, Zhiji, and Pan, Xianli
- Subjects
SUPPORT vector machines ,QUADRATIC programming ,COMPUTATIONAL complexity - Abstract
Twin support-vector machine (TSVM), which generates two nonparallel hyperplanes by solving a pair of smaller-sized quadratic programming problems (QPPs) instead of a single larger-sized QPP, works faster than the standard SVM, especially for large-scale data sets. However, the traditional TSVM adopts the hinge loss, which makes it sensitive to noise and unstable under resampling. To enhance the performance of the TSVM, we present a novel TSVM with the pinball loss (Pin-TSVM), which deals with the quantile distance and is less sensitive to noise points. We further investigate its properties, including noise insensitivity, between-class distance maximization, and within-class scatter minimization. In addition, we compare our Pin-TSVM with the twin parametric-margin SVM and the SVM with the pinball loss in theory. Numerical experiments on a synthetic data set and 14 benchmark data sets with different noises demonstrate the feasibility and validity of our proposed method. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
21. Top-$k$ Partial Label Machine.
- Author
-
Gong, Xiuwen, Yuan, Dong, and Bao, Wei
- Subjects
LINEAR programming ,PHASE-locked loops ,MATHEMATICAL optimization ,MACHINERY - Abstract
To deal with ambiguities in partial label learning (PLL), the existing PLL methods implement disambiguations, by either identifying the ground-truth label or averaging the candidate labels. However, these methods can be easily misled by the false-positive labels in the candidate label set. We find that these ambiguities often originate from the noise caused by highly correlated or overlapping candidate labels, which leads to the difficulty in identifying the ground-truth label on the first attempt. To give the trained models more tolerance, we first propose the top-$k$ partial loss and convex top-$k$ partial hinge loss. Based on the losses, we present a novel top-$k$ partial label machine (TPLM) for partial label classification. An efficient optimization algorithm is proposed based on accelerated proximal stochastic dual coordinate ascent (Prox-SDCA) and linear programming (LP). Moreover, we present a theoretical analysis of the generalization error for TPLM. Comprehensive experiments on both controlled UCI datasets and real-world partial label datasets demonstrate that the proposed method is superior to the state-of-the-art approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
22. Scalable Nonparametric Low-Rank Kernel Learning Using Block Coordinate Descent.
- Author
-
Hu, En-Liang and Kwok, James T.
- Subjects
NONPARAMETRIC estimation ,MATHEMATICAL statistics ,ALGORITHMS ,KERNEL operating systems ,COMPUTER operating systems - Abstract
Nonparametric kernel learning (NPKL) is a flexible approach to learn the kernel matrix directly without assuming any parametric form. It can be naturally formulated as a semidefinite program (SDP), which, however, is not very scalable. To address this problem, we propose the combined use of low-rank approximation and block coordinate descent (BCD). Low-rank approximation avoids the expensive positive semidefinite constraint in the SDP by replacing the kernel matrix variable with $\mathbf{V}^{\top}\mathbf{V}$, where $\mathbf{V}$ is a low-rank matrix. The resultant nonlinear optimization problem is then solved by BCD, which optimizes each column of $\mathbf{V}$ sequentially. It can be shown that the proposed algorithm has nice convergence properties and low computational complexities. Experiments on a number of real-world data sets show that the proposed algorithm outperforms state-of-the-art NPKL solvers. [ABSTRACT FROM AUTHOR] (A minimal sketch of the low-rank BCD idea follows this record.)
- Published
- 2015
- Full Text
- View/download PDF
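The key trick in the record above is replacing the PSD kernel variable with $\mathbf{V}^{\top}\mathbf{V}$ and running block coordinate descent over the columns of $\mathbf{V}$. The sketch below applies that reparameterization to an assumed, much simpler surrogate objective (squared losses on a few pairwise similarity targets), so it shows the structure of the method rather than the paper's NPKL formulation.

```python
# Minimal sketch (assumed surrogate objective, not the paper's NPKL program):
# parameterize the kernel as K = V^T V (PSD by construction, V low rank) and run
# block coordinate descent over the columns of V against squared losses on a few
# observed pairwise similarity targets.
import numpy as np

def bcd_lowrank_kernel(n, rank, pairs, targets, n_sweeps=200, lr=0.1, seed=0):
    """pairs: list of (i, j); targets: desired K[i, j] values."""
    V = np.random.default_rng(seed).normal(scale=0.5, size=(rank, n))
    for _ in range(n_sweeps):
        for col in range(n):                       # one block = one column of V
            grad = np.zeros(rank)
            for (i, j), t in zip(pairs, targets):
                if col in (i, j):
                    other = j if col == i else i
                    resid = V[:, i] @ V[:, j] - t
                    grad += resid * V[:, other]    # d/dV[:,col] of 0.5*(K_ij - t)^2
            V[:, col] -= lr * grad                 # gradient step on that block
    return V.T @ V                                 # learned kernel, K = V^T V

pairs   = [(0, 1), (0, 2), (1, 2), (3, 4)]
targets = [1.0, 0.0, 0.0, 1.0]                     # must-link ~ 1, cannot-link ~ 0
K = bcd_lowrank_kernel(n=5, rank=2, pairs=pairs, targets=targets)
print(np.round(K, 2))
```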
23. On the Rates of Convergence From Surrogate Risk Minimizers to the Bayes Optimal Classifier.
- Author
-
Zhang, Jingwei, Liu, Tongliang, and Tao, Dacheng
- Subjects
NP-hard problems ,LOGISTIC regression analysis ,SUPPORT vector machines ,SAMPLE size (Statistics) - Abstract
In classification, the use of 0–1 loss is preferable since the minimizer of 0–1 risk leads to the Bayes optimal classifier. However, due to the nonconvexity of 0–1 loss, this optimization problem is NP-hard. Therefore, many convex surrogate loss functions have been adopted. Previous works have shown that if a Bayes-risk consistent loss function is used as a surrogate, the minimizer of the empirical surrogate risk can achieve the Bayes optimal classifier as the sample size tends to infinity. Nevertheless, the comparison of convergence rates of minimizers of different empirical surrogate risks to the Bayes optimal classifier has rarely been studied. Which characterization of the surrogate loss determines its convergence rate to the Bayes optimal classifier? Can we modify the loss function to achieve a faster convergence rate? In this article, we study the convergence rates of empirical surrogate minimizers to the Bayes optimal classifier. Specifically, we introduce the notions of consistency intensity and conductivity to characterize a surrogate loss function and exploit this notion to obtain the rate of convergence from an empirical surrogate risk minimizer to the Bayes optimal classifier, enabling fair comparisons of the excess risks of different surrogate risk minimizers. The main result of this article has practical implications including: 1) showing that hinge loss (SVM) is superior to logistic loss (Logistic regression) and exponential loss (Adaboost) in the sense that its empirical minimizer converges faster to the Bayes optimal classifier and 2) guiding the design of new loss functions to speed up the convergence rate to the Bayes optimal classifier with a data-dependent loss correction method inspired by our theorems. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
24. EVERY OBITUARY'S FIRST PARAGRAPH.
- Author
-
ZAUZMER, EMILY
- Subjects
FASTENERS ,MONOPOLIES ,HEALTH insurance ,PRODUCT design ,ACTING awards - Published
- 2024
25. FASTENER FOR TOLERANCE COMPENSATION.
- Subjects
FASTENERS - Published
- 2021
26. SAVING FASTENERS WITH TEFLON TAPE.
- Author
-
WARGASKI, ALAN
- Subjects
SAVINGS ,FASTENERS ,ADHESIVE tape ,ATHLETIC tape - Abstract
The author offers suggestions on avoiding broken fasteners when fastening into hardwood and mentions wrapping Teflon tape around the threads to lubricate them without leaving residue behind.
- Published
- 2020
27. Product Highlights.
- Subjects
FASTENERS ,ELECTRIC connectors ,FACTORIES ,TEMPERATURE control ,HEAT treatment ,PROCESS control systems - Abstract
The article evaluates process heating equipment, which includes the Mix-Proof Valve for Dairy Applications from SPX Flow, the Sanitary Steam Heater for Starch Cooking from Komax Systems Inc., and the Flow Control Ball Valve from Asahi/America Inc.
- Published
- 2020
28. Automated Dispensing in Aerospace.
- Author
-
Sprovieri, John
- Subjects
AEROSPACE materials ,FASTENERS ,ROBOTICS ,ASSEMBLY line methods ,FILLER materials ,SPECIFIC gravity - Published
- 2019
29. Discriminative Feature Selection via Employing Smooth and Robust Hinge Loss.
- Author
-
Peng, Hanyang and Liu, Cheng-Lin
- Subjects
FEATURE selection ,ROBUST control ,SUPPORT vector machines - Abstract
A wide variety of sparsity-inducing feature selection methods have been developed in recent years. Most of the loss functions of these approaches are built upon regression since it is general and easy to optimize, but regression is not well suited to classification. In contrast, the hinge loss (HL) of support vector machines has proved to be powerful for handling classification tasks, but a model with the existing multiclass HL and sparsity regularization is difficult to optimize. In view of this, we propose a new loss, called the smooth and robust HL, which gathers the merits of regression and HL but overcomes their drawbacks, and apply it to our sparsity-regularized feature selection model. To optimize the model, we present a new variant of the accelerated proximal gradient (APG) algorithm, which boosts the discriminative margins among different classes compared with standard APG algorithms. We further propose an efficient optimization technique to solve the proximal projection problem at each iteration step, which is a key component of the new APG algorithm. We theoretically prove that the new APG algorithm converges at rate $O({1}/{k^{2}})$ if the problem is convex ($k$ is the iteration counter), which is the optimal convergence rate for smooth problems. Experimental results on nine publicly available data sets demonstrate the effectiveness of our method. [ABSTRACT FROM AUTHOR] (A generic APG sketch for sparsity-regularized feature selection follows this record.)
- Published
- 2019
- Full Text
- View/download PDF
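The record above builds on an accelerated proximal gradient (APG) scheme for a sparsity-regularized feature selection model. The sketch below is a generic FISTA skeleton with a quadratically smoothed hinge loss and an $\ell_{2,1}$-norm proximal step; the smoothed loss, the Lipschitz estimate, and the parameter values are assumptions and do not reproduce the paper's smooth robust hinge loss or its modified APG.

```python
# Minimal sketch (generic FISTA skeleton, not the paper's modified APG or exact
# loss): accelerated proximal gradient for a smoothed ("Huberized") hinge loss
# with an l2,1-norm regularizer.  Rows of W that shrink to zero correspond to
# dropped features.
import numpy as np

def smoothed_hinge_grad(z):       # derivative of the quadratically smoothed hinge
    return np.where(z >= 1, 0.0, np.where(z <= 0, -1.0, z - 1.0))

def prox_l21(W, t):               # row-wise soft thresholding
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return np.maximum(0.0, 1.0 - t / (norms + 1e-12)) * W

def fista_feature_select(X, Y, lam=0.1, iters=200):
    """X: (n, d); Y: (n, c) labels in {-1, +1}, one column per one-vs-rest task."""
    n, d = X.shape
    c = Y.shape[1]
    L = np.linalg.norm(X, 2) ** 2 / n                    # Lipschitz estimate
    W = np.zeros((d, c)); Z = W.copy(); tk = 1.0
    for _ in range(iters):
        M = Y * (X @ Z)                                  # margins y_ic * (x_i . w_c)
        G = X.T @ (Y * smoothed_hinge_grad(M)) / n       # gradient of the smooth part
        W_new = prox_l21(Z - G / L, lam / L)             # proximal step
        tk_new = (1 + np.sqrt(1 + 4 * tk ** 2)) / 2
        Z = W_new + ((tk - 1.0) / tk_new) * (W_new - W)  # Nesterov momentum
        W, tk = W_new, tk_new
    return W

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
w_true = np.zeros(20); w_true[:3] = 2.0                  # only 3 informative features
Y = np.sign(X @ w_true + 0.1 * rng.normal(size=100)).reshape(-1, 1)
W = fista_feature_select(X, Y)
print(np.flatnonzero(np.linalg.norm(W, axis=1) > 1e-3))  # indices of nonzero rows
```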
30. Drywall Repair.
- Author
-
Berendsohn, Roy
- Subjects
DRYWALL, WALLBOARD, REPAIRING, MAINTENANCE, FASTENERS, HARDWARE - Abstract
The article informs that there's nothing like drywall repair to get the creative juices flowing. The article discusses various techniques for drywall repair. Cut a square piece of drywall to cover the hole, and then on the back of the patch remove a thin strip of drywall along each edge without removing the paper. Done correctly, this forms a paper flange on the patch's perimeter. Apply compound to the paper and press the patch in place. There is no tape, no backing board, no screws, no finding studs; Install a hinge-pin doorstop or a baseboard stop to prevent future doorknob damage. And use an 8-in.-wide drywall knife, as opposed to a 6-in., for a smoother repair.
- Published
- 2005
31. Plastic Rivet.
- Subjects
RIVETS & riveting ,FASTENERS - Abstract
The article reviews new plastic rivets as of August 1, 1944.
- Published
- 1944
32. On the Impact of Regularization Variation on Localized Multiple Kernel Learning.
- Author
-
Han, Yina, Yang, Kunde, Yang, Yixin, and Ma, Yuanliang
- Subjects
MACHINE learning ,SUPPORT vector machines ,KERNEL operating systems - Abstract
This brief analyzes the effects of regularization variations in the localized kernel weights on the hypothesis generated by localized multiple kernel learning (LMKL) algorithms. Recent research on LMKL includes imposing different regularizations on the localized kernel weights and has led to varying formulations and solution strategies. Following the stability analysis theory as presented by Bousquet and Elisseeff, we give stability bounds based on the norm of the variation of localized kernel weights for three LMKL methods cast in the support vector machine classification framework, including vector $\ell_p$-norm LMKL, matrix-regularized $(r,p)$-norm LMKL, and samplewise $\ell_p$-norm LMKL. Further comparison of these bounds helps to qualitatively reveal the performance differences produced by these regularization methods, that is, matrix-regularized LMKL achieves superior performance, followed by vector $\ell_p$-norm LMKL and samplewise $\ell_p$-norm LMKL. Finally, a set of experimental results on ten benchmark machine learning UCI data sets is reported and shown to empirically support our theoretical analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
33. Groupwise Retargeted Least-Squares Regression.
- Author
-
Wang, Lingfeng and Pan, Chunhong
- Subjects
LEAST squares ,REGRESSION analysis ,BIG data ,MATHEMATICAL models ,QUADRATIC equations - Abstract
In this brief, we propose a new groupwise retargeted least squares regression (GReLSR) model for multicategory classification. The main motivation behind GReLSR is to utilize an additional regularization to restrict the translation values of ReLSR, so that they should be similar within the same class. By analyzing the regression targets of ReLSR, we propose a new formulation of ReLSR, where the translation values are expressed explicitly. On the basis of the new formulation, discriminative least-squares regression can be regarded as a special case of ReLSR with zero translation values. Moreover, a groupwise constraint is added to ReLSR to form the new GReLSR model. Extensive experiments on various machine learning data sets illustrate that our method outperforms the current state-of-the-art approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
34. Fasteners for Foam and Honeycomb.
- Author
-
Sprovieri, John
- Subjects
FASTENERS ,WELDING ,REINFORCED plastics - Abstract
The article reports on a fastening process developed by Weber Screwdriving Systems and fastener manufacturer EJOT to solve the parcel shelf issue faced by automaker Audi. Topics include the elements of automatic screwdriving, plastic fastener anchors and spin-welding technology used by the fastening process for plastic parts assembly, and the role of the system in fastening foam and honeycomb core structures, carbon-fiber-reinforced plastic, and glass-reinforced plastic.
- Published
- 2016
35. Feedback Stabilization for the Mass Balance Equations of an Extrusion Process.
- Author
-
Diagne, Mamadou, Shang, Peipei, and Wang, Zhiqiang
- Subjects
FEEDBACK control systems ,EXTRUSION process ,MASS budget (Geophysics) ,HYPERBOLIC differential equations ,PARTIAL differential equations ,ORDINARY differential equations - Abstract
In this article, we study the stabilization problem for an extrusion process in the isothermal case. The model expresses the mass conservation in the extruder chamber and consists of a hyperbolic Partial Differential Equation (PDE) and a nonlinear Ordinary Differential Equation (ODE) whose dynamics describes the evolution of a moving interface. By using a Lyapunov approach, we obtain the exponential stabilization for the closed-loop system under natural feedback controls through indirect measurements. Numerical simulations are also provided with a comparison between the proposed approach and linear PI feedback controller. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
36. Retargeted Least Squares Regression Algorithm.
- Author
-
Zhang, Xu-Yao, Wang, Lingfeng, Xiang, Shiming, and Liu, Cheng-Lin
- Subjects
ISOTONIC regression ,ALGORITHMS ,LEAST squares ,MATHEMATICAL statistics ,SUM of squares - Abstract
This brief presents a framework of retargeted least squares regression (ReLSR) for multicategory classification. The core idea is to directly learn the regression targets from the data rather than using the traditional zero–one matrix as regression targets. The learned target matrix can guarantee a large margin constraint for the requirement of correct classification for each data point. Compared with traditional least squares regression (LSR) and a recently proposed discriminative LSR model, ReLSR is much more accurate in measuring the classification error of the regression model. Furthermore, ReLSR is a single and compact model, hence there is no need to train two-class (binary) machines that are independent of each other. The convex optimization problem of ReLSR is solved elegantly and efficiently with an alternating procedure including regression and retargeting as substeps. The experimental evaluation over a range of databases confirms the validity of our method. [ABSTRACT FROM AUTHOR] (A simplified sketch of the alternating regression and retargeting procedure follows this record.)
- Published
- 2015
- Full Text
- View/download PDF
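The alternating "regression and retargeting" procedure mentioned above can be sketched in a few lines. In the hedged Python sketch below, the regression step is plain ridge regression and the retargeting step simply raises the true-class target whenever the margin-1 requirement is violated, a feasible but non-optimal stand-in for the paper's closed-form target update.

```python
# Minimal sketch (simplified, under stated assumptions): the alternating
# regression / retargeting idea behind ReLSR.  Step 1 fits a ridge regression to
# the current targets; step 2 re-learns the targets from the predictions while
# enforcing a margin of 1 for the true class (here by raising the true-class
# entry, not the paper's exact closed-form update).
import numpy as np

def relsr_sketch(X, y, n_classes, lam=1.0, iters=10):
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])                   # absorb the bias term
    T = -np.ones((n, n_classes)); T[np.arange(n), y] = 1   # start from +/-1 targets
    for _ in range(iters):
        W = np.linalg.solve(Xb.T @ Xb + lam * np.eye(d + 1), Xb.T @ T)  # regression
        P = Xb @ W                                          # current predictions
        T = P.copy()                                        # retargeting step
        others = P.copy(); others[np.arange(n), y] = -np.inf
        need = others.max(axis=1) + 1.0                     # margin-1 requirement
        T[np.arange(n), y] = np.maximum(P[np.arange(n), y], need)
    return W

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.5, size=(40, 2)) for c in (-2, 0, 2)])
y = np.repeat(np.arange(3), 40)
W = relsr_sketch(X, y, n_classes=3)
pred = np.argmax(np.hstack([X, np.ones((120, 1))]) @ W, axis=1)
print("training accuracy:", (pred == y).mean())
```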
37. What turing himself said about the imitation game.
- Author
-
Proudfoot, Diane
- Subjects
CODING theory ,DATA compression ,COMPUTERS ,MACHINE theory ,BIOGRAPHICAL films - Abstract
The Imitation Game, the recent biopic about Alan Turing's efforts to decipher Nazi naval codes, was showered with award nominations. It even won the 2015 Academy Award for Best Adapted Screenplay. One thing it won't win any awards for, though, is its portrayal of the "imitation game" itself: Turing's proposed test of machine thinking, which hinges on whether a computer can convincingly imitate a person. The Turing test, as it is now called, doesn't really feature in the film. (Given that the movie gets so much of the history wrong, perhaps that's a good thing.) [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
38. Keep Your Bearings.
- Author
-
Cloepfil, John
- Subjects
BEARINGS (Machinery) ,ADHESIVE tape ,TIRES ,WHEELS ,FASTENERS - Abstract
The article provides a tip for keeping hub bearings in place when a flat tire is changed on a caster wheel. The ends of the hub may be covered with duct tape. There is no need to remove the tape after the tire is remounted. Simply poke the hole in the tape with a punch and reinsert the bolt that fastens the wheel to the equipment.
- Published
- 2000
39. Power projects.
- Subjects
WOOD ,TEXTILES ,POWER tools ,FASTENERS ,HARDWARE ,ADHESIVES - Abstract
The article presents suggestions to personalise one's home with great-looking DIY projects that take very little time to create. All one needs are basic power tools, easy-to-find materials and a little bit of elbow grease. Required materials include 25mm-thick wooden dowel or similar; 8 x 8g 25mm countersunk wood screws; liquid adhesive and craft glue; and felt to cover 3 table bases. A cordless or power drill with both metal and wood bits in the following sizes will be used: 5 mm bit, 25 mm bit and 35 mm spade bit. A welder, a Phillips head screwdriver, scissors, a backsaw, a handsaw, and a spanner or adjustable wrench will also be needed.
- Published
- 2004
40. NEW + NOTEWORTHY.
- Subjects
BOLTED joints ,DRAWING (Metalwork) ,MATERIALS handling ,INLET valves ,COMPUTER-aided design software ,FASTENERS - Published
- 2020
41. TALKING SHOP WITH ESQUIRE.
- Subjects
ADHESIVES ,FASHION accessories ,FASTENERS - Published
- 1950
42. NEW PRODUCTS.
- Subjects
LOUDSPEAKERS ,VACUUM lifters ,FASTENERS - Abstract
The article offers brief information on various products including Radio Doraphone speaker from Setchell Carlson Inc., Red Devil Vacuum Cup Lifter from Landon P. Smith Inc., and Hold Rite Attachment shower curtain magnetic fastener from Artcraft Waterproof Products Co.
- Published
- 1940
43. NEW USES FOR OLD THINGS.
- Author
-
Page, Melinda and Wells, Elizabeth
- Subjects
HOUSEHOLD supplies ,NEEDLES & pins ,SEWING ,PENCILS ,FASTENERS ,ZIPPERS - Abstract
The article presents information about how few ordinary household things can be used for other purposes. A matchbox can be used for storing a tiny travel sewing kit. A matchbox is the perfect size for holding needles, thread, buttons, and a few safety pins. The pencil lead can be used for unsticking a stubborn zipper. Just rub the teeth on both sides of the zipper with a pencil. Drinking straw can be used for extending the stems of roses and ranunculus to fit a vase. Place the bottom of each stem inside a straw, then arrange the flowers as you normally would.
- Published
- 2006
44. ASSEMBLY EQUIPMENT AND FASTENERS.
- Subjects
FASTENERS ,JOINTS (Engineering) ,WOODWORKING machinery ,WOODWORKING industries ,WOODWORK equipment ,EQUIPMENT & supplies - Abstract
The article features wood assembly equipment and fasteners from different companies in the U.S. The products featured include push button clamping system from James L. Taylor Manufacturing, Tritec drills from Gannormat, six types of dowels from Miller Dowel, MPH 400 multipurpose case clamp from Ligmatech and Sprint PTP drilling and dowel insertion machine from Koch.
- Published
- 2006
45. Two ways you can insulate basement walls.
- Author
-
Powell, Evan
- Subjects
HOUSE insulation ,THERMAL insulation ,BASEMENTS ,WALLS ,FASTENERS ,ADHESIVES - Abstract
The article offers information on ways of insulating a basement wall. It suggests using extruded polystyrene sheathing, such as the Dow Chemical tongue-and-groove Styrofoam panels that are applied directly to the wall with adhesive. It also recommends a newer insulating panel, Rmax Thermawall, which can be easily installed using the special Thermawall fastening system.
- Published
- 1982
46. No-show for metal siding.
- Subjects
FASTENERS - Abstract
The article reports that Armco Steel Corp. is expecting to launch its full-scale production of pre-painted fasteners by fall 1969 if pilot tests for its electro-painting process will prove successful.
- Published
- 1969
47. ASSEMBLY Online.
- Subjects
FASTENERS ,CONSCIOUS automata ,CONVEYING machinery ,CONVEYING machinery industry ,ROBOTS - Published
- 2019
48. INNOVATIONS.
- Author
-
Rehana, Sharon J.
- Subjects
COMMERCIAL products ,FASTENERS ,CONCRETE products ,STONE - Abstract
The article offers brief information on several commercial products including the Handy Camel bag clips, NighTec Leuchtsteine light concrete stones from Kann, and concrete wallpaper designed by Piet Boon.
- Published
- 2013
49. 4 NEW USES FOR BINDER CLIPS.
- Author
-
Chantim, Andra and Edelstein, Julia
- Subjects
FASTENERS ,OFFICE equipment & supplies - Abstract
The article discusses four new uses of binder clips including as sponge stand, sealing the bags of vegetables using binder clips before putting into the freezer, and as grocer-bag organizer.
- Published
- 2012
50. Micro Plastics Snap Rivets.
- Subjects
RIVETS & riveting ,FASTENERS ,WOODWORKING machinery ,WOODWORK equipment ,HARDWARE ,CARPENTRY - Abstract
The article states that Micro Plastics Inc. announced that it has added 34 new sizes to its present line of "Snap Rivets." These new "Snap Rivets" now include sizes to fit .083- to .248-in. hole diameters, with a panel thickness of .024 to .400 in. The rivets were designed for assemblies that can be installed without tools. The two-piece molded fastener comes assembled and ready for use. One can simply place the rivet into a hole, and press the oversized head. The specially designed legs expand and firmly lock the components permanently in place.
- Published
- 2005