140,352 results
Search Results
2. A novel artificial neural network approach for residual life estimation of paper insulation in oil‐immersed power transformers.
- Author
-
Nezami, Md. Manzar, Equbal, Md. Danish, Ansari, Md. Fahim, Alotaibi, Majed A., Malik, Hasmat, García Márquez, Fausto Pedro, and Hossaini, Mohammad Asef
- Subjects
- *
ARTIFICIAL neural networks , *POWER transformers , *TRANSFORMER insulation , *ARTIFICIAL intelligence , *MATHEMATICAL optimization - Abstract
Avoiding financial losses requires preventing catastrophic breakdowns of oil‐filled power transformers, which calls for continuous online transformer monitoring. The authors use the condition of the paper insulation to evaluate transformer health for this purpose. The study proposes a new artificial intelligence method for estimating the residual life of paper insulation in oil‐immersed power transformers. Four artificial intelligence models based on backpropagation neural networks predict paper insulation lifespan. Four primary failure indices of transformer insulating paper (degree of polymerisation, 2‐furfuraldehyde, carbon monoxide, and carbon dioxide) form the basis of these models. Each model estimates paper insulation life from one failure index together with moisture and temperature data. Optimisation techniques tune the number of hidden-layer neurons and the epoch count for improved performance. Results are validated against literature‐based life models, establishing a precise input–output correlation. This method accurately predicts the remaining useful life of power transformer paper insulation, enabling utilities to take proactive measures for safe and efficient transformer operation. [ABSTRACT FROM AUTHOR]
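The pipeline this abstract describes, a small backpropagation network mapping one failure index plus moisture and temperature to remaining life, can be sketched in a few lines. The training data, network size, and learning rate below are invented for illustration; they are not values from the paper:

```python
import math
import random

random.seed(0)

# Hypothetical training triples: (failure index, moisture, temperature) -> remaining life.
# All numbers are invented for illustration; they are not data from the paper.
data = [((0.9, 0.2, 0.3), 0.8), ((0.5, 0.5, 0.6), 0.4), ((0.2, 0.8, 0.9), 0.1)]

H = 4  # hidden-neuron count (a quantity the paper tunes via optimisation)
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b) for ws, b in zip(w1, b1)]
    return h, sum(w * hi for w, hi in zip(w2, h)) + b2

lr = 0.1
for _ in range(5000):              # the epoch count is also tuned in the paper
    for x, y in data:
        h, yhat = forward(x)
        err = yhat - y             # gradient of 0.5 * squared error w.r.t. yhat
        for j in range(H):
            grad_h = err * w2[j] * h[j] * (1 - h[j])
            for i in range(3):
                w1[j][i] -= lr * grad_h * x[i]
            b1[j] -= lr * grad_h
            w2[j] -= lr * err * h[j]
        b2 -= lr * err

mse = sum((forward(x)[1] - y) ** 2 for x, y in data) / len(data)
```

After training, the mean squared error on the (toy) training set should be small, mimicking the precise input–output correlation the abstract reports.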
- Published
- 2024
- Full Text
- View/download PDF
3. Scheduling Heating Tasks on Parallel Furnaces with Setup Times and Conflicts
- Author
-
Lange, Julia, Fath, Philipp, Sayah, David, Barbosa-Povoa, Ana Paula, Editorial Board Member, de Almeida, Adiel Teixeira, Editorial Board Member, Gans, Noah, Editorial Board Member, Gupta, Jatinder N. D., Editorial Board Member, Heim, Gregory R., Editorial Board Member, Hua, Guowei, Editorial Board Member, Kimms, Alf, Editorial Board Member, Li, Xiang, Editorial Board Member, Masri, Hatem, Editorial Board Member, Nickel, Stefan, Editorial Board Member, Qiu, Robin, Editorial Board Member, Shankar, Ravi, Editorial Board Member, Slowiński, Roman, Editorial Board Member, Tang, Christopher S., Editorial Board Member, Wu, Yuzhe, Editorial Board Member, Zhu, Joe, Editorial Board Member, Zopounidis, Constantin, Editorial Board Member, Trautmann, Norbert, editor, and Gnägi, Mario, editor
- Published
- 2022
- Full Text
- View/download PDF
4. Theoretical analysis and design of roller mower straight blade.
- Author
-
Zhang, Lingyan, Yao, Cheng, Ying, Weiqiang, Luo, Shijian, and Ying, Fangtian
- Subjects
- *
PAPER arts , *MATHEMATICAL optimization , *STRUCTURAL optimization , *ENERGY consumption , *CONSUMPTION (Economics) - Abstract
To study the cutting performance of the straight-edge hob and reduce its cutting power consumption, this paper takes the cutting power of the straight-edge hob as the minimisation objective and establishes a mathematical model for the optimised design of the straight-edge hob based on the composite (complex) optimisation method. The mathematical model is solved with MATLAB. At the same time, the mowing characteristics of a roller blade were studied by investigating how variables such as rotational speed and roller diameter coordinate with the mowing parameters. A parameter analysis of the straight-edge hob before and after structural parameter optimisation is presented, and a design method is proposed on this basis. After defining the objective function and constraint conditions, the influence of the structural parameters on the power consumption and efficiency of the hob was determined by complex-method optimisation; this made it possible to adjust the hob parameters to significantly lower its power consumption. The energy consumption of the optimised design is reduced by 11.1% compared with the original scheme, a remarkable improvement. The results show that the best working parameters of the hob are a cutting speed of 1000 r/min and a sliding cutting angle and grinding edge angle of 25–30°. Moreover, practical tests demonstrated the feasibility of using the proposed method to design the straight-edge hob to improve mowing performance and hob stability. This study provides a parameter foundation and an optimisation method for lowering the chopping power consumption of the roller mower blade. [ABSTRACT FROM AUTHOR]
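The optimisation step this abstract describes, searching working parameters to minimise cutting power within bounds, can be illustrated with a toy version. The power function, the bounds, and the use of plain random search in place of the paper's complex method are all assumptions made for illustration:

```python
import random

random.seed(1)

# Hypothetical surrogate for cutting power as a function of rotational speed
# (r/min) and sliding-cutting angle (degrees); NOT the paper's actual model.
def power(speed, angle):
    return 1e-4 * (speed - 1000) ** 2 + 0.05 * (angle - 27.5) ** 2 + 2.0

bounds = [(600, 1400), (15, 40)]   # assumed speed and angle ranges

best = None
for _ in range(5000):              # simple random search stands in for the complex method
    x = [random.uniform(lo, hi) for lo, hi in bounds]
    p = power(*x)
    if best is None or p < best[0]:
        best = (p, x)

best_power, (best_speed, best_angle) = best
```

On this invented surrogate the search lands near the kind of optimum the abstract reports (speed about 1000 r/min, angle in the mid-20s to low-30s of degrees), which is the shape of result a parameter optimisation like this produces.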
- Published
- 2024
- Full Text
- View/download PDF
5. Web Development and Performance Comparison of Web Development Frameworks: A Review Paper.
- Author
-
Vayadande, Kuldeep, Purohit, Shlok, Rathod, Chaitanya, Rathod, Manish, Rathi, Parth, and Rathi, Purvesh
- Subjects
WEB development ,HEALTH websites ,MATHEMATICAL optimization ,OPTICAL disks ,DECISION making ,SCALABILITY - Abstract
Web development frameworks streamline and speed up the process of building web applications. The array of available frameworks makes it hard for developers to decide which option best suits their projects. This survey paper presents an extensive comparison of web development frameworks, considering the primary components, strengths, and weaknesses of each. The paper applies a rigorous assessment methodology to performance metrics such as response time, scalability, and memory usage. Optimization techniques and security considerations for each framework are also examined. Finally, the paper offers guidance to help readers choose the most appropriate framework for their project needs. The data presented in this survey are intended to guide developers and decision makers toward well-informed choices for their websites. [ABSTRACT FROM AUTHOR]
- Published
- 2024
6. Preventing Hot Spots in High Dose-Rate Brachytherapy
- Author
-
Morén, Björn, Larsson, Torbjörn, Tedgren, Åsa Carlsson, Kliewer, Natalia, editor, Ehmke, Jan Fabian, editor, and Borndörfer, Ralf, editor
- Published
- 2018
- Full Text
- View/download PDF
7. Mathematical Optimization of Design Parameters of Photovoltaic Module
- Author
-
Kubík, Dávid, Loebl, Jaroslav, Hutchison, David, Series Editor, Kanade, Takeo, Series Editor, Kittler, Josef, Series Editor, Kleinberg, Jon M., Series Editor, Mattern, Friedemann, Series Editor, Mitchell, John C., Series Editor, Naor, Moni, Series Editor, Pandu Rangan, C., Series Editor, Steffen, Bernhard, Series Editor, Terzopoulos, Demetri, Series Editor, Tygar, Doug, Series Editor, Weikum, Gerhard, Series Editor, Woon, Wei Lee, editor, Aung, Zeyar, editor, Catalina Feliú, Alejandro, editor, and Madnick, Stuart, editor
- Published
- 2018
- Full Text
- View/download PDF
8. Trim Loss Optimization in Paper Production Using Reinforcement Artificial Bee Colony
- Author
-
Santitham Prom-on, Booncharoen Sirinaovakul, Charoenchai Khompatraporn, and Suthida Fairee
- Subjects
Mathematical optimization ,General Computer Science ,Computer science ,swarm intelligence ,General Engineering ,Paper production ,Stock cutting ,Inventory cost ,Trim ,Artificial bee colony algorithm ,pulp and paper industry ,Cutting stock problem ,General Materials Science ,artificial bee colony algorithm ,lcsh:Electrical engineering. Electronics. Nuclear engineering ,Reinforcement ,Integer programming ,optimization ,lcsh:TK1-9971 - Abstract
In paper production, a jumbo reel is cut into multiple intermediate rolls, and each intermediate roll is then sheeted as finished goods. This is a cutting stock problem, which is known to be NP-hard. The objective is to minimize material waste, or trim loss, across all the cuttings. If any intermediate roll is not entirely used for its associated order, the roll itself can become dead stock. We use the concept of universal sizes of intermediate rolls to eliminate dead stock: a pre-defined number of universal intermediate-roll sizes is used to serve all the orders. The problem is solved using a Reinforcement Artificial Bee Colony algorithm with an Integer Linear Programming subroutine. This approach is tested with a set of 1,055 orders and 127 different sizes of sheet papers from a paper manufacturer. The results reveal that our method outperforms other algorithms, achieving a total trim loss of 3.51%, compared to the trim loss of at least 5% reported by the industry. The approach not only reduces the number of partially cut rolls but also decreases the number of jumbo reels needed to serve all the orders, saving both inventory cost and material cost.
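A minimal version of the trim-loss objective the abstract describes: enumerate the feasible slitting patterns of one jumbo reel and keep the one with the least waste. The reel and roll widths here are invented for illustration; the paper's bee-colony search instead explores pattern combinations across many orders:

```python
from itertools import product

JUMBO = 6200                       # hypothetical jumbo-reel width (mm)
widths = [880, 1030, 1260]         # hypothetical intermediate-roll widths (mm)

best = None
# Enumerate how many rolls of each width to slit from one jumbo reel.
max_counts = [JUMBO // w for w in widths]
for counts in product(*(range(m + 1) for m in max_counts)):
    used = sum(c * w for c, w in zip(counts, widths))
    if used <= JUMBO:
        trim = JUMBO - used        # leftover width wasted on this pattern
        if best is None or trim < best[0]:
            best = (trim, counts)

trim, counts = best
trim_loss_pct = 100.0 * trim / JUMBO
```

For these widths the best pattern slits three 880 mm, one 1030 mm, and two 1260 mm rolls, wasting only 10 mm of the 6200 mm reel; industrial instances add order quantities and machine constraints on top of this core calculation.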
- Published
- 2020
9. The Collected Papers of Leonid Hurwicz : Volume 1
- Author
-
Samiran Banerjee
- Subjects
- Economics, Mathematical, Mathematical optimization
- Abstract
Leonid Hurwicz (1917–2008) was a major figure in modern theoretical economics whose contributions over sixty-five years spanned at least five areas: econometrics, nonlinear programming, decision theory, microeconomic theory, and mechanism design. In 2007, at age ninety, he received the Nobel Memorial Prize in Economics (shared with Eric Maskin and Roger Myerson) for pioneering the field of mechanism design and incentive compatibility. Hurwicz made seminal contributions in the other areas as well. In nonlinear programming, he contributed to the understanding of Lagrange-Kuhn-Tucker problems (along with co-authors Kenneth Arrow and Hirofumi Uzawa). In econometrics, the Hurwicz bias in the least-squares analysis of time series is a fundamental and commonly cited benchmark. In decision theory, the Hurwicz criterion for decision-making under ambiguity is routinely invoked, sometimes without a citation since his original paper was never published. In microeconomic theory, Hurwicz (along with Arrow and H.D. Block) initiated the study of the stability of the market mechanism, and (with Uzawa) solved the classic integrability-of-demand problem, a core result in neoclassical consumer theory. While some of Hurwicz's works were published in journals, many remain scattered as chapters in books that are difficult to access, and others were never published at all. The Collected Papers of Leonid Hurwicz is the first of four volumes that will bring his oeuvre together in one place, to bring to light the totality of his intellectual output and document his contribution to economics and the extent of his legacy, with the express purpose of making it easily available for future generations of researchers to build upon.
- Published
- 2022
10. Trend and current practices of coagulation-based hybrid systems for pulp and paper mill effluent treatment: mechanisms, optimization techniques and performance evaluation.
- Author
-
Jagaba, Ahmad Hussaini, Birniwa, Abdullahi Haruna, Usman, Abdullahi Kilaco, Mu'azu, Nuhu Dalhat, Yaro, Nura Shehu Aliyu, Soja, Usman Bala, Abioye, Kunmi Joshua, Almahbashi, Najib Mohammed Yahya, Al-dhawi, Baker Nasser Saleh, Noor, Azmatullah, and Lawal, Ibrahim Mohammed
- Subjects
- *
WATER purification , *HYBRID systems , *PAPER pulp , *MATHEMATICAL optimization , *PULP mills , *SANITATION , *DECONTAMINATION (From gases, chemicals, etc.) - Abstract
This paper presents an overview of pulp and paper mill (PPM) production processes, the resulting release of wastewater effluent loaded with a wide range of pollutants, and the associated environmental impacts. The review highlights the different types of functional materials, and their modified forms, employed as coagulants for pulp and paper mill effluent (PPME) treatment. These have been intensively studied as a promising strategy for PPMs to achieve cleaner and more sustainable treatment in accordance with sustainable development goals (SDGs) "6 – Clean water and sanitation", "9 – Industry, innovation, and infrastructure", and "12 – Responsible consumption and production". Standalone coagulation treatment processes are inherently ineffective at meeting increasingly stringent discharge requirements, and suffer from higher energy demand and increased operational and maintenance costs. Owing to the recalcitrant nature of PPME contaminants, this review explores the effectiveness of coagulation processes for the decontamination of PPME. Furthermore, the review surveys state-of-the-art coagulation-based hybrid systems employed for enhanced PPME treatment. The process limitations, influencing factors, and optimization techniques are highlighted, as is how sustained research in the subject area contributes to cleaner production. The review also discusses coagulant classifications and the synergistic, antagonistic, and shock-load toxic effects of hybrid coagulants on toxicant biodegradation and the associated system efficiency. Moreover, it offers a guide for the development and application of sustainable hybrid-based coagulants for PPME treatment. The findings presented herein provide a vital theoretical foundation for sustainable solutions to improve the efficiency of coagulation-based hybrid systems and their scale-up towards potential commercialization.
• Pulp and paper mills produce effluents containing diverse and emerging contaminants.
• Different materials and their modified forms are used as coagulants for PPME treatment.
• The synergistic and shock-load toxic effects of hybrid-based coagulants could be identified.
• Hybrid systems are highly efficient techniques for PPME treatment.
• The review highlights process weaknesses, influencing factors and optimization techniques. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
11. Multiple-choice knapsack-based heuristic algorithm for the two-stage two-dimensional cutting stock problem in the paper industry.
- Author
-
Kim, Kyungdoc, Kim, Byung-In, and Cho, Hyunbo
- Subjects
CUTTING stock problem ,MATHEMATICAL optimization ,MATHEMATICAL models ,HEURISTIC algorithms ,KNAPSACK problems ,PAPER mills - Abstract
This study examines a two-stage two-dimensional cutting stock problem encountered by a paper mill company. The problem includes various machine-related and operational constraints drawn from real-world situations. Paper products are manufactured using two major cutting processes. Each cutting machine has specific minimum and maximum widths for input and output rolls and is limited by the maximum number of rolls it can cut at the same time. A mathematical model is presented to formally state the problem, and an efficient multiple-choice knapsack-based heuristic algorithm is proposed to solve it. To demonstrate the efficiency of the proposed heuristic algorithm, computational experiments are conducted on test data sets generated from real-world data provided by a large paper mill company in the Republic of Korea. [ABSTRACT FROM PUBLISHER]
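The core subproblem behind this heuristic, the multiple-choice knapsack (pick exactly one option per class subject to a capacity), can be solved by a small dynamic program. The classes, weights, and values below are invented for illustration:

```python
# Multiple-choice knapsack: choose exactly one (weight, value) option from each
# class so that total weight stays within CAP and total value is maximized.
classes = [
    [(2, 3), (3, 5)],        # options for class 0
    [(1, 1), (4, 7)],        # options for class 1
    [(2, 2), (3, 4)],        # options for class 2
]
CAP = 8

NEG = float("-inf")
dp = [0] + [NEG] * CAP       # dp[w] = best value at exact total weight w
for options in classes:
    new = [NEG] * (CAP + 1)  # rebuild per class to enforce "exactly one" choice
    for w in range(CAP + 1):
        if dp[w] == NEG:
            continue
        for wt, val in options:
            if w + wt <= CAP:
                new[w + wt] = max(new[w + wt], dp[w] + val)
    dp = new

best = max(dp)
```

Here the optimum packs weights 2 + 4 + 2 = 8 for a value of 12; in the paper's setting each "class" corresponds to a set of mutually exclusive cutting choices.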
- Published
- 2014
- Full Text
- View/download PDF
12. Performance Evaluation of Stochastic Model of a Paper Machine Having Three Types of Faults
- Author
-
Pooja Bhatia and Veena Rani
- Subjects
Mathematical optimization ,Paper machine ,business.product_category ,Computer science ,Stochastic modelling ,business - Published
- 2020
13. Development of Point-of-care Paper Based Strip for the Detection of Simple Antipyretic-analgesic Drugs
- Author
-
Srinivasa Rao and Gvsr Pavan Kumar
- Subjects
Mathematical optimization ,business.industry ,Analgesic ,Medicine ,Antipyretic ,Paper based ,business ,Simple (philosophy) ,medicine.drug ,Point of care - Published
- 2019
14. A note on the paper 'Necessary and sufficient optimality conditions using convexifactors for mathematical programs with equilibrium constraints'
- Author
-
N. Gadhi
- Subjects
Mathematical optimization ,Work (electrical) ,Computer science ,Management Science and Operations Research ,Fault (power engineering) ,Computer Science Applications ,Theoretical Computer Science ,Counterexample - Abstract
In this work, some counterexamples are given to refute some results in the paper by Kohli [RAIRO:OR 53 (2019) 1617–1632]. We correct the fault in some of his results.
- Published
- 2021
15. Machine Learning-Based Energy System Model for Tissue Paper Machines
- Author
-
Mengna Hong, Huanhuan Zhang, and Jigeng Li
- Subjects
0209 industrial biotechnology ,Mathematical optimization ,Computer science ,020209 energy ,Bioengineering ,Environmental pollution ,02 engineering and technology ,lcsh:Chemical technology ,pulp and paper ,Tissue paper ,energy system ,lcsh:Chemistry ,Computer Science::Hardware Architecture ,020901 industrial engineering & automation ,modeling and simulation ,energy consumption ,Linear regression ,0202 electrical engineering, electronic engineering, information engineering ,Chemical Engineering (miscellaneous) ,lcsh:TP1-1185 ,Computer Science::Operating Systems ,Consumption (economics) ,Artificial neural network ,Process Chemistry and Technology ,Energy consumption ,Tree (data structure) ,Mean absolute percentage error ,lcsh:QD1-999 - Abstract
With the global energy crisis and environmental pollution intensifying, tissue papermaking enterprises urgently need to save energy. An energy consumption model is essential for the energy saving of tissue paper machines, but the energy consumption of a tissue paper machine is very complicated, and building a mechanistic energy consumption model for one is laborious and difficult. This article therefore builds an empirical energy consumption model for tissue paper machines, covering both electricity consumption and steam consumption. Since the process parameters strongly influence the energy consumption of tissue paper machines, this study uses three methods (linear regression, artificial neural network, and extreme gradient boosting tree) to relate the process parameters to power consumption and to steam consumption. The best power consumption model and the best steam consumption model are then selected from among the models built with these three methods and combined into the energy consumption model of the tissue paper machine. Finally, the models established by the three methods are evaluated. The experimental results show that using an empirical model for tissue paper machine energy consumption is feasible, and that the power consumption and steam consumption models established by the extreme gradient boosting tree are better than those established by linear regression and the artificial neural network.
The mean absolute percentage errors of the electricity consumption model and the steam consumption model built by the extreme gradient boosting tree are approximately 2.72 and 1.87, respectively, and the root mean square errors of these two models are about 4.74 and 0.03, respectively.
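The two error measures reported here can be computed directly from predictions and measurements; the readings below are invented for illustration:

```python
import math

def mape(actual, pred):
    # Mean absolute percentage error, in percent.
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, pred)) / len(actual)

def rmse(actual, pred):
    # Root mean square error, in the units of the measurements.
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual))

# Illustrative steam-consumption readings vs. model predictions (made up).
actual = [10.0, 12.0, 11.0, 13.0]
pred = [10.2, 11.8, 11.3, 12.6]
```

MAPE is scale-free, which is why the abstract can quote it for both electricity and steam models, while RMSE stays in each quantity's own units.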
- Published
- 2021
16. Comments on 'A Note on the Paper 'Optimality Conditions for Optimistic Bilevel Programming Problem Using Convexifactors''
- Author
-
N. Gadhi
- Subjects
Mathematical optimization ,021103 operations research ,Control and Optimization ,Applied Mathematics ,0211 other engineering and technologies ,010103 numerical & computational mathematics ,02 engineering and technology ,Management Science and Operations Research ,01 natural sciences ,Bilevel optimization ,Bellman equation ,Theory of computation ,0101 mathematics ,Mathematics - Abstract
Necessary optimality conditions for a bilevel optimization problem are given in the paper by Kohli (J Optim Theory Appl 152: 632–651, 2012). Recently, the same author corrected his results in the note (J Optim Theory Appl 181:706–707, 2019). In this work, we have pointed out that some of the new modifications are wrong. We correct the flaws and present an alternative proof for the main result.
- Published
- 2021
17. Energy Saving for Tissue Paper Mills by Energy-Efficiency Scheduling under Time-of-Use Electricity Tariffs
- Author
-
Zhiqiang Zeng, Kaiyao Wang, and Xiaobin Chen
- Subjects
tissue paper mill ,Mathematical optimization ,Computer science ,0211 other engineering and technologies ,Scheduling (production processes) ,ComputerApplications_COMPUTERSINOTHERSYSTEMS ,Bioengineering ,02 engineering and technology ,lcsh:Chemical technology ,Multi-objective optimization ,Tissue paper ,lcsh:Chemistry ,energy saving ,Chemical Engineering (miscellaneous) ,Mill ,lcsh:TP1-1185 ,021108 energy ,ComputingMethodologies_COMPUTERGRAPHICS ,021103 operations research ,Job shop scheduling ,business.industry ,Process Chemistry and Technology ,time-of-use electricity tariffs ,multi-objective optimization ,lcsh:QD1-999 ,Electricity ,Energy source ,business ,Efficient energy use - Abstract
Environmental concerns and soaring energy prices have put tissue paper mills under great pressure to save energy and reduce emissions. Electricity is one of the main energy sources of tissue paper mills, and their production characteristics make it easy to decrease energy cost by exploiting time-of-use (TOU) electricity tariffs. This study investigates the bi-objective energy-efficiency scheduling of tissue paper mills under TOU electricity tariffs, the objectives of which are makespan and energy cost. First, considering the processing energy cost, setup energy cost, and transportation energy cost, an energy cost model of a tissue paper mill under TOU electricity tariffs is established. Second, the energy-efficiency scheduling model under TOU electricity tariffs is built on top of the energy cost model. Finally, on the basis of decomposition and teaching–learning optimization, this study proposes a novel multi-objective evolutionary algorithm, further combined with variable neighborhood search, to solve the problem. The case study results demonstrate that energy saving in tissue paper mills is feasible and that the proposed method outperforms existing methods.
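The energy-cost side of the model this abstract describes reduces to summing load times tariff over the hours a task runs. The tariff bands, task, and power figures below are invented to show why shifting work off-peak lowers cost:

```python
# Hypothetical time-of-use tariff (currency units per kWh) by hour of day.
def tariff(hour):
    if 8 <= hour < 12 or 18 <= hour < 22:
        return 1.2          # peak
    if 12 <= hour < 18:
        return 0.8          # shoulder
    return 0.4              # off-peak

def energy_cost(schedule):
    # schedule: list of (start_hour, duration_h, power_kW) tasks.
    total = 0.0
    for start, dur, kw in schedule:
        for h in range(start, start + dur):
            total += kw * tariff(h % 24)   # cost for each full hour of running
    return total

day_shift = [(9, 4, 50)]     # run a 50 kW job from 09:00 for 4 h
night_shift = [(1, 4, 50)]   # the same job shifted into off-peak hours
```

With these numbers the day run costs 220 while the identical night run costs 80; the paper's scheduler trades such savings off against makespan.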
- Published
- 2021
18. Some Remarks on The Paper 'Global Optimization in Metric Spaces With Partial Orders'
- Author
-
Jack Markin and Moosa Gabeleh
- Subjects
Mathematical optimization ,Metric space ,Regular polygon ,Global optimization ,Mathematics - Published
- 2021
19. On Estimating Maximum Sum Rate of MIMO Systems with Successive Zero-Forcing Dirty Paper Coding and Per-antenna Power Constraint
- Author
-
Le-Nam Tran, Thuy M. Pham, and Ronan Farrell
- Subjects
FOS: Computer and information sciences ,Computer Science - Machine Learning ,Mathematical optimization ,Computer science ,Computer Science - Information Theory ,Information Theory (cs.IT) ,MIMO ,020302 automobile design & engineering ,020206 networking & telecommunications ,Machine Learning (stat.ML) ,02 engineering and technology ,Precoding ,Machine Learning (cs.LG) ,0203 mechanical engineering ,Statistics - Machine Learning ,0202 electrical engineering, electronic engineering, information engineering ,Zero Forcing Equalizer ,Dirty paper coding ,Communication channel ,Mimo systems ,Computer Science::Information Theory - Abstract
In this paper, we study sum rate maximization for successive zero-forcing dirty-paper coding (SZFDPC) with a per-antenna power constraint (PAPC). Although SZFDPC is a low-complexity alternative to the optimal dirty paper coding (DPC), efficient algorithms to compute its sum rate remain open problems, especially under practical PAPC. The existing solution to the considered problem is computationally inefficient because it employs a high-complexity interior-point method. In this study, we propose two new low-complexity approaches to this important problem. More specifically, the first algorithm achieves the optimal solution by transforming the original problem in the broadcast channel into an equivalent problem in the multiple access channel; the resulting problem is then solved by alternating optimization together with successive convex approximation. We also derive a suboptimal solution based on machine learning, to which simple linear regressions are applicable. The approaches are analyzed and validated extensively to demonstrate their superiority over the existing approach.
- Published
- 2019
20. Constrained Communication Over the Gaussian Dirty Paper Channel
- Author
-
Guojun Chen, Yinfei Xu, and Jian Lu
- Subjects
Mathematical optimization ,Computer science ,Gaussian ,020206 networking & telecommunications ,Data_CODINGANDINFORMATIONTHEORY ,02 engineering and technology ,Upper and lower bounds ,Dirty paper ,symbols.namesake ,0202 electrical engineering, electronic engineering, information engineering ,symbols ,Joint (audio engineering) ,Computer Science::Information Theory ,Communication channel - Abstract
The problem of joint information transmission and input signalling estimation over a state-dependent channel is considered. General single-letter upper and lower bounds on the optimal capacity–distortion trade-off are provided. For the Gaussian dirty paper channel, we evaluate the bounds by choosing the parameters carefully and eventually obtain a computable form, yielding a complete characterization of this problem setting.
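A brief numerical aside on the dirty paper channel behind both of these entries: Costa's classical result is that when the transmitter knows the interference, capacity equals that of the interference-free AWGN channel, whatever the interference power. The power values below are illustrative:

```python
import math

def awgn_capacity(P, N):
    # Capacity of an AWGN channel with signal power P and noise power N,
    # in bits per channel use.
    return 0.5 * math.log2(1 + P / N)

# Costa's writing-on-dirty-paper result: with interference known at the
# transmitter, capacity does not depend on the interference power Q.
P, N, Q = 1.0, 1.0, 5.0
dpc_capacity = awgn_capacity(P, N)          # Q does not appear

# A naive receiver that treats the interference as noise does much worse:
naive_rate = 0.5 * math.log2(1 + P / (N + Q))
```

Here dpc_capacity is 0.5 bit per channel use, independent of Q = 5, while the interference-as-noise rate collapses; this gap is what makes dirty-paper-style precoding attractive.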
- Published
- 2019
21. FORMULATION AND DEVELOPMENT OF EFAVIRENZ TABLETS BY PAPER TECHNIQUE USING CO-SOLVENCY METHOD
- Author
-
Y. Prasanth, Ch. Dhana Subrahmanyeswari, and Sameeda Rubeen
- Subjects
Mathematical optimization ,chemistry.chemical_compound ,Solvency ,Efavirenz ,chemistry ,Pharmaceutical Science ,Mathematics - Abstract
Objective: The present study formulates and develops efavirenz tablets by the paper technique using the co-solvency method; the drug is an antiviral used for the treatment of HIV. Methods: Seven formulations (F1–F7) were prepared using different tissue papers, such as kitchen roll paper, handkerchief paper, and facial tissue paper, with different weights. The prepared tablets were evaluated for hardness, friability, thickness, content uniformity, disintegration time, and in vitro dissolution. Results: Among all the formulations, F2 (kitchen roll paper with a weight of 250 mg) was considered the best formulation, releasing up to 98.02% of the drug in 3 h. Stability studies of formulation F2 over a period of 2 mo indicated that the formulation was stable. Conclusion: It was concluded that a paper tablet of efavirenz shows better results; it does not contain any excipient and increases the dissolution rate.
- Published
- 2019
22. A hybrid collaborative algorithm to solve an integrated wood transportation and paper pulp production problem
- Author
-
José Eduardo Pécora Junior, Angel Ruiz, and Patrick Soriano
- Subjects
Marketing ,Mathematical optimization ,021103 operations research ,Linear programming ,Computer science ,Heuristic ,Strategy and Management ,Computation ,Pulp (paper) ,0211 other engineering and technologies ,Time horizon ,02 engineering and technology ,Management Science and Operations Research ,engineering.material ,Hybrid algorithm ,Management Information Systems ,Robustness (computer science) ,0202 electrical engineering, electronic engineering, information engineering ,engineering ,020201 artificial intelligence & image processing ,Heuristics - Abstract
This paper proposes a hybrid algorithm to tackle a real-world problem arising in the context of pulp and paper production. This situation is modelled as a production problem where one has to decide which wood will be used by each available processing unit (wood cooker) in order to minimize the variance of wood densities within each cooker for each period of the planning horizon. The proposed hybrid algorithm is built around two distinct phases. The first phase uses two interacting heuristic methods to identify a promising reduced search space, which is then thoroughly explored in the second phase. This hybrid algorithm produces high-quality solutions in reasonable computation times, especially for the largest test instances. Extensive computational experiments demonstrated the robustness and efficiency of the method.
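The objective this abstract describes, keeping wood densities homogeneous within each cooker, can be made concrete with a toy heuristic: sort the lots by density and slice them into contiguous blocks. The densities and cooker count are invented, and sorting-and-slicing is only a stand-in for the paper's two-phase hybrid algorithm:

```python
# Toy assignment: split wood lots (by density) among cookers so that the
# densities within each cooker are as homogeneous as possible.
densities = [410, 455, 390, 520, 480, 405, 530, 470, 395]   # kg/m^3, made up
COOKERS = 3

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Heuristic: sort, then cut into contiguous equal-size blocks.
lots = sorted(densities)
size = len(lots) // COOKERS
blocks = [lots[i * size:(i + 1) * size] for i in range(COOKERS)]
obj = sum(variance(b) for b in blocks)        # total within-cooker variance

# Baseline for comparison: an arbitrary (unsorted) split of the same lots.
raw_blocks = [densities[i * size:(i + 1) * size] for i in range(COOKERS)]
raw_obj = sum(variance(b) for b in raw_blocks)
```

Even this crude grouping drives the within-cooker variance well below an arbitrary split; the paper's algorithm additionally handles transportation and the multi-period planning horizon.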
- Published
- 2016
23. Design of robust distribution network under demand uncertainty: A case study in the pulp and paper
- Author
-
Mustapha Ouhimmou, Mustapha Nourelfath, Mathieu Bouchard, and Naji Bricha
- Subjects
Economics and Econometrics ,Iterative and incremental development ,Mathematical optimization ,021103 operations research ,Distribution networks ,Computer science ,Total cost ,business.industry ,05 social sciences ,0211 other engineering and technologies ,Robust optimization ,02 engineering and technology ,Management Science and Operations Research ,General Business, Management and Accounting ,Industrial and Manufacturing Engineering ,Outsourcing ,Robustness (computer science) ,0502 economics and business ,Supply chain network ,business ,050203 business & management - Abstract
The design of a supply chain network helps companies deal with variability and the uncertain evolution of demand over time. An efficient supply chain network can fulfill customers' demands quickly and at least cost, so it is important to solve the problem addressed in this article: the design of the distribution network under demand uncertainty. The problem is to determine which warehouses to open and how much space to rent (outsource) in warehouses owned by third-party logistics providers. This paper presents the development and application of a robust optimization methodology for the distribution network design problem under demand uncertainty. The proposed method allows the designer to find a network configuration whose total cost is robust to typical changes in the geographical distribution of the demand. The algorithm is an iterative process based on Benders decomposition, with two steps per iteration. In the first step, the global design problem (master problem) is solved to decide on the best use of warehouses according to the information provided by the previous iterations. In the second step, for a given warehouse configuration and under some restrictions on demand variations, the demand that incurs the largest transportation cost is determined, given that the transportation plan itself is optimal. These steps are repeated until the warehouse configuration that gives the smallest worst-case transportation cost is found. At each iteration, the worst-case transportation cost sub-problem provides new information to the global design problem, so that the latter can improve its robustness. We report numerical results for real-size network problems. The main results show that a high level of robustness of the distribution network can be achieved at a relatively low cost.
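The min-max logic this abstract iterates over (choose a configuration whose worst-case transportation cost is smallest) can be shown by brute force on a tiny instance. The warehouses, costs, and demand scenarios are invented, capacities are ignored, and enumeration replaces the paper's Benders decomposition:

```python
from itertools import combinations

# Cost per unit shipped from each candidate warehouse to each region (made up).
transport = {
    "W1": {"north": 2, "south": 6},
    "W2": {"north": 5, "south": 3},
    "W3": {"north": 4, "south": 4},
}
# Demand per region under each scenario (made up).
scenarios = [
    {"north": 100, "south": 40},
    {"north": 40, "south": 100},
]

def scenario_cost(open_whs, demand):
    # Serve each region from its cheapest open warehouse (unlimited capacity).
    return sum(d * min(transport[w][r] for w in open_whs)
               for r, d in demand.items())

# Open exactly 2 of the 3 candidates; minimize the worst case over scenarios.
worst_cost, config = min(
    (max(scenario_cost(cfg, s) for s in scenarios), cfg)
    for cfg in combinations(transport, 2)
)
```

Opening W1 and W2 wins here with a worst-case cost of 380: each warehouse is cheap for one region, so the pair hedges against either demand shift, which is exactly the robustness the full method buys at scale.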
- Published
- 2019
24. Topological optimization of the stiffness of an irregular structure based on an element size independent filter.
- Author
-
Diao, Shijing, Wang, Deshi, and Wang, Xudong
- Subjects
- *
STRUCTURAL optimization , *FILTER paper , *CONSTRUCTION materials , *MATHEMATICAL optimization - Abstract
Because a grid-independent filter overly averages element sensitivities in the topological optimization of an irregular structure, a topological optimization model was built for the structural domain. The maximization of stiffness was first taken as the goal for the topological optimization of irregular-structure stiffness. Subsequently, an element-size filter was proposed to address the overly averaged local element sensitivity that arises with a grid-independent filter when the element size in the design domain varies dramatically. Finally, the element sensitivity of the objective function was derived under the given constraints. A case study was then conducted on a naval gun mount, with the minimization of structural flexibility as the objective function and the volume of structural material as a constraint. A stiffness optimization model based on the bi-directional evolutionary structural optimization algorithm was adopted for the topological optimization of the gun mount. Structural optimization was conducted for the gun mount at different shooting angles to realize its optimal stiffness and strength under the constraint of a consistent material volume. The optimization results proved that the element-size filter proposed in this paper can be effectively applied in the topological optimization of an irregular structure and used to explore the topological optimization of the supporting structure under impact. [ABSTRACT FROM AUTHOR]
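The sensitivity filtering this abstract discusses can be sketched in one dimension: each element's sensitivity is replaced by a distance-weighted average of its neighbours within a radius, which smooths checkerboard-like fields. The linear weighting, radius, and values below are standard-textbook illustrations, not the paper's element-size filter itself:

```python
# 1-D sketch of a radius-based sensitivity filter.
def filter_sensitivities(sens, coords, rmin):
    out = []
    for xi in coords:
        num = den = 0.0
        for xj, sj in zip(coords, sens):
            w = max(0.0, rmin - abs(xi - xj))   # linear weight, zero beyond rmin
            num += w * sj
            den += w
        out.append(num / den)                   # distance-weighted average
    return out

sens = [1.0, 0.0, 1.0, 0.0, 1.0]     # checkerboard-like raw sensitivities
coords = [0.0, 1.0, 2.0, 3.0, 4.0]   # element centres on a unit grid
smoothed = filter_sensitivities(sens, coords, rmin=1.5)
```

The filtered field oscillates far less than the raw one; the paper's contribution is making the weighting behave well when neighbouring elements have very different sizes, which a fixed-radius filter like this handles poorly.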
- Published
- 2022
- Full Text
- View/download PDF
25. The UAM service network: multi-objective and multi-period design for UAM airports
- Author
-
Boo, Jeongjoon, Lee, Seung Yeob, and Song, Byung Duk
- Published
- 2023
- Full Text
- View/download PDF
26. Preface for the Special Issue Optimization, Variational Analysis, and Applications in Honor of Professor Franco Giannessi.
- Author
-
Ansari, Qamrul Hasan, Mordukhovich, Boris S., and Pappalardo, Massimo
- Subjects
SIMPLEX algorithm ,LINEAR complementarity problem ,MATHEMATICAL optimization ,CONTACT mechanics ,LIPSCHITZ continuity ,COMPLEMENTARITY constraints (Mathematics) - Abstract
The numerical algorithm by Cristofari et al. modifies the augmented Lagrangian method ALGENCAN proposed by Andreani and his collaborators by incorporating certain second-order information into the augmented Lagrangian framework. Professor Franco Giannessi, University of Pisa, is an outstanding mathematician whose contributions to optimization theory and its applications and to the world optimization community are difficult to overstate. The paper by Izmailov and Solodov introduces and develops a novel perturbed augmented Lagrangian method framework for constrained optimization problems. [Extracted from the article]
- Published
- 2022
- Full Text
- View/download PDF
27. Reliability-Optimal Designs in MEC Networks with Finite Blocklength Codes and Outdated CSI: (Invited Paper)
- Author
-
M. Cenk Gursoy, Yulin Hu, and Yang Yang
- Subjects
Reliability theory ,Mathematical optimization ,Base station ,Mobile edge computing ,Optimization problem ,Computer science ,Reliability (computer networking) ,Server ,Computational resource ,Edge computing - Abstract
There has been increasing interest in mobile edge computing (MEC) in recent years. Unlike traditional centralized cloud computing, MEC servers are deployed at the edges of networks, such as at base stations (BSs) and access points (APs), in order to support computation-intensive and latency-critical applications. In this paper, we consider a multi-user MEC network in which wireless data transmission/offloading is performed using finite blocklength (FBL) codes to satisfy the latency constraints. The reliability of the communication phase is characterized in the FBL regime, while the event of queue-length violation in the computation phase is investigated by exploiting extreme value theory. We first formulate the overall optimization problem in the scenario of multiple user equipments (UEs), aiming to minimize the maximal end-to-end error probability among all UEs under both FBL and energy consumption constraints. We then propose a two-level learning-based approach to jointly determine the time allocations in the FBL regime, the UEs' offloading decisions and the MEC computational resource allocations. Simulation results demonstrate that the proposed two-level learning-based algorithm solves the problem efficiently.
- Published
- 2021
28. Boosting symbolic execution via constraint solving time prediction (experience paper)
- Author
-
Sicheng Luo, Hui Xu, Yangfan Zhou, Yanxiang Bi, and Xin Wang
- Subjects
Constraint (information theory) ,Set (abstract data type) ,Mathematical optimization ,Boosting (machine learning) ,Computer science ,Scalability ,Process (computing) ,The Symbolic ,Timeout ,Symbolic execution - Abstract
Symbolic execution is an essential approach for automated test case generation. However, the approach is generally not scalable to large programs. One critical reason is that the constraint solving problems in symbolic execution are generally hard. Consequently, the symbolic execution process may get stuck in solving such hard problems. To mitigate this issue, symbolic execution tools generally rely on a timeout threshold to terminate the solving. Such a timeout is generally set to a fixed, predefined value, e.g., five minutes in angr. Nevertheless, setting a proper timeout is critical to the tool's efficiency. This paper proposes an approach that tackles the problem by predicting the time required to solve a constraint model, so that the symbolic execution engine can use this information to decide whether to continue the current solving process. Due to the cost of the prediction itself, our approach triggers the predictor only when the solving time has already exceeded a relatively small value. We have shown that such a predictor can achieve promising performance with several different machine learning models and datasets. By further employing an adaptive design, the predictor can achieve an F1-score ranging from 0.743 to 0.800 on these datasets. We then apply the predictor to eight programs and conduct simulation experiments. Results show that the efficiency of constraint solving for symbolic execution can be improved by 1.25x to 3x, depending on the distribution of the hardness of the programs' constraint models.
- Published
- 2021
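The timeout policy described in the abstract above reduces to a small decision rule; the grace period, budget, and `decide` helper below are hypothetical illustrations of that idea, not the authors' tool:

```python
# Sketch of the adaptive-timeout idea (thresholds are made up): let cheap
# queries run, and consult the solving-time predictor only once a query has
# already exceeded a small grace period, since prediction itself has a cost.
GRACE = 1.0      # seconds of solving before the predictor is consulted
BUDGET = 10.0    # hard per-query time budget

def decide(elapsed, predicted_total):
    """Return 'continue' or 'abort' for a constraint query still running
    after `elapsed` seconds, given the predictor's estimate of total time."""
    if elapsed < GRACE:
        return "continue"          # too cheap to bother predicting
    return "continue" if predicted_total <= BUDGET else "abort"
```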
29. Radial basis neural tree model for improving waste recovery process in a paper industry
- Author
-
Tanujit Chakraborty, Ashis Kumar Chakraborty, and Swarup Chattopadhyay
- Subjects
Mathematical optimization ,bepress|Engineering|Operations Research, Systems Engineering and Industrial Engineering|Industrial Engineering ,Radial basis function network ,Basis (linear algebra) ,bepress|Engineering ,Process (engineering) ,Computer science ,Decision tree ,Management Science and Operations Research ,engrXiv|Engineering|Operations Research, Systems Engineering and Industrial Engineering ,General Business, Management and Accounting ,bepress|Engineering|Operations Research, Systems Engineering and Industrial Engineering ,engrXiv|Engineering ,bepress|Engineering|Mechanical Engineering|Manufacturing ,Modeling and Simulation ,Waste recovery ,engrXiv|Engineering|Operations Research, Systems Engineering and Industrial Engineering|Industrial Engineering ,Decision tree model ,engrXiv|Engineering|Manufacturing Engineering - Abstract
In this article, we propose a novel hybridization of regression trees (RT) and radial basis function networks (RBFN), namely, the radial basis neural tree (RBNT) model, for waste recovery process improvement in the paper industry. As a by-product of the paper manufacturing process, a lot of waste along with valuable fibers and fillers comes out from the paper machine. The waste recovery process (WRP) involves separating the unwanted materials from the valuable ones so that the recovered fibers and fillers can be reused in the production process. This job is done by fiber-filler recovery equipment (FFRE). The efficiency of FFRE depends on several crucial process parameters, and monitoring them is a difficult proposition. The proposed model can be used to find the essential parameters from the set of available data and perform prediction tasks to improve waste recovery process efficiency. An idea of parameter optimization, along with regularity conditions for the universal consistency of the proposed model, is given. The proposed model has the advantages of easy interpretability and excellent performance when applied to the FFRE efficiency improvement problem. Improved waste recovery will help the industry to become environmentally friendly with less ecological damage, apart from being cost-effective.
- Published
- 2019
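The RBFN half of the hybrid described above can be sketched in a few lines. The centers, kernel width, and toy sine response are our own assumptions; the paper's RBNT model additionally partitions the input space with regression trees before fitting local RBF networks:

```python
import numpy as np

# Minimal RBF-network regression sketch (all data made up): Gaussian hidden
# units around fixed centers, output weights fitted by linear least squares.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * X[:, 0])                       # stand-in for a process response

centers = np.linspace(-1, 1, 10).reshape(-1, 1)
width = 0.3

def design(X):
    # squared distance of every sample to every center -> Gaussian activations
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

W, *_ = np.linalg.lstsq(design(X), y, rcond=None)
mse = float(np.mean((design(X) @ W - y) ** 2))
```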
30. Development of Decision Support System for a Paper Making Unit of a Paper Plant Using Genetic Algorithm Technique
- Author
-
Mridul Sharma and Rajeev Khanduja
- Subjects
Mathematical optimization ,Decision support system ,Markov chain ,Differential equation ,Computer science ,Transition diagram ,Sugar industry ,Probabilistic logic ,Performance model - Abstract
In this paper, a decision support system (DSS) for the paper making unit of a paper plant has been developed using a genetic algorithm. Process industries such as paper plants, sugar mills and fertilizer plants comprise various complex engineering units. A paper plant comprises chipping, digesting, washing, bleaching, screening, stock preparation and paper making units, etc. These units are connected in a combined configuration. The paper making unit has four main subsystems. The mathematical model of the paper making unit has been developed using a Markov birth–death process. On the basis of a probabilistic approach, the differential equations are formed from the transition diagram. These equations are solved recursively and then reduced to steady-state conditions, as mostly required in the paper industry. A normalizing condition has been used to find the probability of the full working state with or without the use of standby subsystems, and a performance model has been developed for the paper making unit. Finally, the genetic algorithm technique is used to find the optimum value of unit performance by coordinating the parameters of all subsystems of the paper making unit.
- Published
- 2020
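The birth–death reasoning in the abstract above can be sketched for the simplest case. The rates and the four-subsystem series layout are illustrative assumptions, not the paper's model:

```python
# Steady-state sketch of a Markov birth-death availability model (rates are
# made up): for one repairable subsystem with failure rate lam and repair
# rate mu, balancing flows (lam * P_up = mu * P_down) with the normalizing
# condition P_up + P_down = 1 gives availability mu / (lam + mu).
def availability(lam, mu):
    return mu / (lam + mu)

# Probability of the full working state for four independent subsystems in
# series (a hypothetical configuration) is the product of availabilities.
subsystems = [(0.02, 0.5), (0.01, 0.4), (0.03, 0.6), (0.02, 0.5)]
full_working = 1.0
for lam, mu in subsystems:
    full_working *= availability(lam, mu)
```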
31. A Hybrid FLP-AHP Approach for Optimal Product Mix in Pulp and Paper Industry
- Author
-
Meenu Singh and Millie Pant
- Subjects
Product mix ,Range (mathematics) ,Mathematical optimization ,Production planning ,Profit (accounting) ,Ranking ,GSM ,Computer science ,Order (business) ,Analytic hierarchy process - Abstract
Pulp and paper industries (PPI) manufacture a wide range of papers based on three different GSM (grams per square metre) classes, i.e., lower GSM, middle GSM and higher GSM. In order to maximize profit, a PPI must efficiently utilize its available resources, thereby producing optimal quantities of the three different GSMs. Such problems fall under the category of product-mix problems and form an important part of production planning for every process industry, such as a paper mill. In the present study, this problem is represented as a Fuzzy Linear Programming (FLP) model, to capture the inherent vagueness and uncertainties. The solutions obtained through FLP are further refined with the help of AHP (Analytic Hierarchy Process) to determine the most profitable solution. Results indicate that the ranking obtained by integrating AHP into FLP may provide better guidance to the Decision Maker (DM) in determining an optimal product mix.
- Published
- 2021
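The crisp linear-programming core of such a product-mix model can be sketched as follows. All coefficients are invented; the paper's FLP adds fuzzy tolerances on top of exactly this structure, and AHP then ranks the resulting solutions:

```python
from scipy.optimize import linprog

# Toy product-mix LP (made-up numbers): maximise profit over three GSM grades
# subject to pulp and machine-hour limits. linprog minimises, so negate profit.
profit = [5.0, 4.0, 3.0]         # profit per tonne of low/middle/high GSM
A = [[6, 4, 2],                  # pulp consumed per tonne of each grade
     [1, 2, 1]]                  # machine hours per tonne of each grade
b = [240, 100]                   # available pulp and machine hours

res = linprog(c=[-p for p in profit], A_ub=A, b_ub=b, bounds=[(0, None)] * 3)
x = res.x                        # optimal tonnes of each grade
```

For these numbers the optimum is 10 tonnes of the low-GSM grade and 90 of the high-GSM grade, for a profit of 320.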
32. Pattern Reduction in Paper Cutting
- Author
-
H. Tuenter, C. McDiarmid, M. Shepherd, S.J. Chapman, R. Leese, C. Aldridge, R. Gower, H. Wilson, and A. Zinober
- Subjects
Set (abstract data type) ,Reduction (complexity) ,Mathematical optimization ,Secondary problem ,Linear programming ,Series (mathematics) ,Computer science ,Order (business) ,Cutting stock problem ,Heuristics - Abstract
A large part of the paper industry involves supplying customers with reels of specified width in specified quantities. These 'customer reels' must be cut from a set of wider 'jumbo reels' as economically as possible. The first priority is to satisfy the customer demands using as few jumbo reels as possible. This is an example of the well-known one-dimensional cutting stock problem, which can be solved by creating a series of patterns, each corresponding to a different set of reels that can be cut from a single jumbo. A secondary problem is to minimize the number of patterns needed, in order to reduce the frequency of cutting-knife resets. Existing methods account for 90% of cases; the problem is to provide methods for the remaining 10%. This is shown to be an NP-hard problem, and so efficient design heuristics are proposed and tested for further reducing the frequency of resets.
- Published
- 2021
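The first-priority problem, cutting customer reels from as few jumbos as possible, can be approximated by the classic first-fit-decreasing heuristic. The jumbo width and order data below are invented, and this is not the authors' method, which additionally minimises the number of distinct patterns:

```python
# First-fit-decreasing sketch for the one-dimensional cutting stock core
# (widths and demands are made up): pack customer reels into jumbo reels.
JUMBO = 100

def cut_patterns(orders):
    """orders: list of (width, quantity). Returns a list of patterns, each
    pattern being the list of reel widths cut from one jumbo."""
    reels = sorted((w for w, q in orders for _ in range(q)), reverse=True)
    jumbos = []                       # open jumbos with remaining width
    for w in reels:
        for j in jumbos:              # first fit
            if j["left"] >= w:
                j["left"] -= w
                j["cut"].append(w)
                break
        else:                         # no open jumbo fits: start a new one
            jumbos.append({"left": JUMBO - w, "cut": [w]})
    return [j["cut"] for j in jumbos]

patterns = cut_patterns([(60, 2), (40, 2), (30, 2)])
```

In this toy run three jumbos suffice but only two distinct patterns occur; the number of distinct patterns is exactly the quantity the paper's secondary problem tries to reduce.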
33. The Power of Parallelism in Stochastic Search for Global Optimum: Keynote Paper
- Author
-
Ali Sheikholeslami
- Subjects
symbols.namesake ,Mathematical optimization ,education.field_of_study ,Optimization problem ,Local optimum ,Population ,Monte Carlo method ,symbols ,Parallelism (grammar) ,Markov chain Monte Carlo ,Hamming distance ,Parallel tempering ,education - Abstract
We explore the power of parallelism in stochastic search for the global optimum in high-dimensional optimization problems. Such a search, which often navigates a large solution space through a Markov-chain Monte Carlo (MCMC) process, needs to make stochastic decisions at every step and must escape from local optima. Through parallelism, we are able to survey an entire neighbourhood (all states within a Hamming distance of 1) to make efficient moves, use multiple replicas at different temperatures, as in parallel tempering, or deploy a population of replicas at the same temperature. When combined, these methods of parallelism can yield 100x to 10,000x speedups.
- Published
- 2021
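The parallel-tempering ingredient mentioned in the abstract can be sketched on a one-dimensional double-well energy. The energy function and all constants are our own toy setup, not the paper's experiments:

```python
import math, random

# Parallel-tempering sketch: replicas at several temperatures take Metropolis
# steps, and neighbouring replicas occasionally swap states so the cold chain
# can escape the local optimum it starts in.
random.seed(1)
energy = lambda x: (x * x - 1) ** 2 + 0.2 * x   # wells near x = +/-1, global near -1

temps = [0.05, 0.2, 0.8]                        # cold -> hot
xs = [1.0] * len(temps)                         # start every replica in the wrong well
best = min(energy(x) for x in xs)

for step in range(4000):
    for i, T in enumerate(temps):               # one Metropolis move per replica
        cand = xs[i] + random.gauss(0, 0.3)
        if random.random() < math.exp(min(0.0, -(energy(cand) - energy(xs[i])) / T)):
            xs[i] = cand
    if step % 10 == 0:                          # attempt a neighbour swap
        i = random.randrange(len(temps) - 1)
        d = (1 / temps[i] - 1 / temps[i + 1]) * (energy(xs[i]) - energy(xs[i + 1]))
        if random.random() < math.exp(min(0.0, d)):
            xs[i], xs[i + 1] = xs[i + 1], xs[i]
    best = min(best, min(energy(x) for x in xs))
```

The right-hand well bottoms out at an energy of about +0.197, so any visit to the global (left) well drives `best` below zero.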
34. Simultaneous analysis and design based optimization for paper path and timing design of a high-volume printer
- Author
-
J.M. van de Mortel-Fronczak, Jacobus E. Rooda, Lfp Pascal Etman, Lou Somers, L. Swartjes, and Control Systems Technology
- Subjects
Imagination ,Optimization ,0209 industrial biotechnology ,Mathematical optimization ,Optimization problem ,Computer science ,Computation ,media_common.quotation_subject ,02 engineering and technology ,020901 industrial engineering & automation ,0202 electrical engineering, electronic engineering, information engineering ,Paper path ,Printer ,Electrical and Electronic Engineering ,Dimensioning ,Simulation ,media_common ,Mechanical Engineering ,Volume (computing) ,Optimal control ,020202 computer hardware & architecture ,Computer Science Applications ,Control and Systems Engineering ,SAND ,Path (graph theory) ,Model - Abstract
The design of a high-volume printer for professional use is rather complex. The design of the paper path and the timing of sheets is frequently reengineered as the design of the printer components progresses. This paper presents an optimization model for the combined paper path and timing design problem. The paper path is an optimal physical dimensioning problem, while the timing is an open-loop optimal control problem. The coupled optimization problem is formulated as a simultaneous analysis and design (SAND) problem using a direct transcription of the optimal control problem. Benefits of the chosen formulation for industrial application are the ease of setting up the optimization model for arbitrary printer configurations, and the short computation times. Results of an industrial case are presented.
- Published
- 2017
35. Optimization of an Integrated Lot Sizing and Cutting Stock Problem in the Paper Industry
- Author
-
Sônia Cristina Poltroniere, Kelly Cristina Poldi, Silvio Alexandre de Araujo, Universidade Estadual Paulista (Unesp), and Universidade Estadual de Campinas (UNICAMP)
- Subjects
0209 industrial biotechnology ,Mathematical optimization ,Optimization problem ,Process (engineering) ,Computer science ,0211 other engineering and technologies ,Time horizon ,02 engineering and technology ,020901 industrial engineering & automation ,Order (exchange) ,Production (economics) ,cutting stock problem ,problema de dimensionamento de lotes ,021103 operations research ,Mathematical model ,lcsh:Mathematics ,Problema integrado ,lot sizing problem ,lcsh:QA1-939 ,problema de corte de estoque ,Sizing ,paper industry ,Cutting stock problem ,indústria de papel ,integrated problem - Abstract
Two important optimization problems occur in the planning and production scheduling of paper industries: the lot sizing problem and the cutting stock problem. The lot sizing problem must determine the quantity of jumbos of different types of paper to be produced on each machine over a finite planning horizon. These jumbos are then cut in order to meet the demand of items for each period. In this paper, we deal with the integration of these two problems, aiming to minimize the costs of production and inventory of jumbos, as well as the trim loss of paper generated during the cutting process. Two mathematical models for the integrated problem are considered, and these models are solved both heuristically and using an optimization package. Attempting to get lower bounds for the problem, relaxed versions of the models have also been solved. Finally, computational experiments are presented and discussed.
- Published
- 2016
36. Survey Paper: Whale optimization algorithm and its variant applications
- Author
-
Basu Dev Shivahare, Amardeep Gupta, Manasees Singh, Deepak Pareta, Biswa Mohan Sahu, and Shivam Ranjan
- Subjects
Statistical classification ,Mathematical optimization ,Local optimum ,Optimization problem ,Computer science ,Image segmentation ,Minification ,Maximization ,Cluster analysis ,Metaheuristic - Abstract
The Whale Optimization Algorithm (WOA), proposed by Seyedali Mirjalili and Andrew Lewis in 2016, is a popular and powerful metaheuristic algorithm for finding global solutions to optimization problems. WOA is a nature-inspired, metaheuristic (randomized and deterministic) algorithm that has been widely used to solve various single-objective, multi-objective and multidimensional optimization problems. WOA and its variants have been applied in engineering, bioinformatics, multi-level image segmentation, clustering, low-pass filter design, email classification, diabetes classification, heterogeneous networks, machine learning, etc. WOA is gradient-free, easy to represent, capable of exploring and exploiting the search space, and able to avoid local optima. This paper presents an overview of WOA, its variants and applications. The performance of WOA has been enhanced by hybridizing it with other methods, such as WOA-PSO, WOA-Levy, WOA-BAT, WOA-ANN and WOA-SVM. The objective of the metaheuristic algorithm is to find the best (leader) position X*, close to the optimal solution for the target prey, over successive iterations. The objective function can be based on either a minimization or a maximization approach.
- Published
- 2021
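The update scheme the survey describes can be sketched compactly. The population size, iteration budget and test function below are our own choices; the parameter schedule follows the commonly published WOA description:

```python
import math, random

# Compact WOA sketch on the sphere function: encircling, random exploration,
# and the spiral bubble-net move, with the leader X* kept elitist.
random.seed(0)
DIM, POP, ITERS = 2, 20, 200
f = lambda x: sum(v * v for v in x)            # minimisation objective

whales = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
leader = min(whales, key=f)[:]                 # best position X* so far

for t in range(ITERS):
    a = 2 - 2 * t / ITERS                      # a decreases linearly from 2 to 0
    for w in whales:
        r, p = random.random(), random.random()
        A, C = 2 * a * r - a, 2 * random.random()
        if p < 0.5:
            if abs(A) < 1:                     # exploit: encircle the leader
                w[:] = [leader[d] - A * abs(C * leader[d] - w[d]) for d in range(DIM)]
            else:                              # explore: move around a random whale
                rnd = random.choice(whales)
                w[:] = [rnd[d] - A * abs(C * rnd[d] - w[d]) for d in range(DIM)]
        else:                                  # spiral bubble-net move (b = 1)
            l = random.uniform(-1, 1)
            w[:] = [abs(leader[d] - w[d]) * math.exp(l) * math.cos(2 * math.pi * l)
                    + leader[d] for d in range(DIM)]
    cur = min(whales, key=f)
    if f(cur) < f(leader):                     # keep the leader elitist
        leader = cur[:]
```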
37. An Efficient Framework For Fast Computer Aided Design of Microwave Circuits Based on the Higher-Order 3D Finite-Element Method (Invited Paper).
- Author
-
LAMECKI, Adam, BALEWSKI, Lukasz, and MROZOWSKI, Michal
- Subjects
COMPUTER-aided design ,FINITE element method ,MICROWAVE circuits ,COMPUTATIONAL physics ,MATHEMATICAL optimization ,DEFORMATIONS (Mechanics) - Abstract
In this paper, an efficient computational framework for the full-wave design by optimization of complex microwave passive devices, such as antennas, filters, and multiplexers, is described. The framework consists of a computational engine, a 3D object modeler, and a graphical user interface. The computational engine, which is based on a finite element method with curvilinear higher-order tetrahedral elements, is coupled with built-in or external gradient-based optimization procedures. For speed, a model order reduction technique is used and the gradient computation is achieved by perturbation with geometry deformation, processed on the level of the individual mesh nodes. To maximize performance, the framework is targeted to multicore CPU architectures and its extended version can also use multiple GPUs. To illustrate the accuracy and high efficiency of the framework, we provide examples of simulations of a dielectric resonator antenna and full-wave design by optimization of two diplexers involving tens of unknowns, and show that the design can be completed within the duration of a few simulations using industry-standard FEM solvers. The accuracy of the design is confirmed by measurements. [ABSTRACT FROM AUTHOR]
- Published
- 2014
38. Commentary on 'A reply to a Note on the paper 'A simplified novel technique for solving fully fuzzy linear programming problems''
- Author
-
Arshdeep Kaur, Ajay Kumar, and Srimantoorao S. Appadoo
- Subjects
Statistics and Probability, Novel technique, Mathematical optimization, Artificial Intelligence, Computer science, General Engineering, Fuzzy linear programming
- Published
- 2019
39. An Integrated Slacks-Based Measure of Super-Efficiency with Input Saving and Output Surplus Scaling Factors and its Application in Paper Chemical Mills
- Author
-
Dong Guo and Zheng-Qun Cai
- Subjects
Mathematical optimization ,021103 operations research ,Article Subject ,Chemistry ,0211 other engineering and technologies ,Nonparametric statistics ,Efficient frontier ,02 engineering and technology ,General Chemistry ,Function (mathematics) ,Measure (mathematics) ,Empirical research ,0202 electrical engineering, electronic engineering, information engineering ,Data envelopment analysis ,020201 artificial intelligence & image processing ,Projection (set theory) ,Scaling ,QD1-999 - Abstract
Data envelopment analysis (DEA), as a nonparametric programming approach, has been widely extended and applied in many areas. Conventional DEA models can measure the efficiency of inefficient decision-making units (DMUs) well but cannot further discriminate among the efficient DMUs. Many methods have been proposed to address this problem. One of the most important is the slacks-based measure of super-efficiency model (S-SBM model) developed by Tone in 2002. However, the projection of a DMU onto the efficient frontier identified by the S-SBM model may not be strongly Pareto-efficient, which causes the super-efficiency score to be misestimated. This paper revises the usual slacks-based measure of super-efficiency by incorporating input saving and output surplus scaling factors into the objective function for measuring DMUs. We integrate the SBM model and the S-SBM model effectively and obtain input saving and output surplus scaling factors, as well as input and output slacks, under a single integrated model. According to the study, the projection reference point identified by our method is strongly Pareto-efficient. Meanwhile, how each decision variable influences the efficiency score for a specific DMU is revealed and illustrated through two numerical examples and an empirical study in paper chemical mills.
- Published
- 2020
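The envelopment mechanics underlying DEA can be illustrated with a minimal radial CCR model; note this is the simpler radial model, sketched with three made-up single-input, single-output DMUs, not the paper's slacks-based or super-efficiency formulation:

```python
from scipy.optimize import linprog

# Input-oriented CCR DEA sketch (toy data): theta is the smallest radial
# input contraction for DMU k such that a convex cone of peers dominates it.
x = [2.0, 4.0, 3.0]      # inputs of the three DMUs
y = [2.0, 3.0, 4.5]      # outputs of the three DMUs

def ccr_theta(k):
    """LP over variables [theta, lambda_1, lambda_2, lambda_3]."""
    c = [1.0, 0.0, 0.0, 0.0]
    A_ub = [[-x[k]] + x,              # sum_j lambda_j x_j <= theta * x_k
            [0.0] + [-v for v in y]]  # sum_j lambda_j y_j >= y_k
    b_ub = [0.0, -y[k]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
    return res.fun

thetas = [ccr_theta(k) for k in range(3)]
```

Only the third DMU (the best output/input ratio) scores 1; the others receive scores strictly below 1, which is the point where slacks-based refinements such as the paper's model take over.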
40. A note on the paper 'Sufficient optimality conditions using convexifactors for optimistic bilevel programming problem'
- Author
-
N. Gadhi
- Subjects
0209 industrial biotechnology ,Mathematical optimization ,021103 operations research ,Control and Optimization ,Computer science ,Applied Mathematics ,Strategy and Management ,0211 other engineering and technologies ,02 engineering and technology ,Mathematical proof ,Bilevel optimization ,Atomic and Molecular Physics, and Optics ,020901 industrial engineering & automation ,Work (electrical) ,Bellman equation ,Business and International Management ,Electrical and Electronic Engineering - Abstract
In this work, some reasoning's mistakes in the paper by Kohli (doi:10.3934/jimo.2020114) are highlighted. Furthermore, we correct the flaws, propose a correct formulation of the main result (Theorem 5.1) and give alternative proofs.
- Published
- 2022
41. First-Order Methods for Energy-Efficient Power Control in Cell-Free Massive MIMO : Invited Paper
- Author
-
Le-Nam Tran and Hien Quoc Ngo
- Subjects
Beamforming ,Mathematical optimization ,Optimization problem ,Computer science ,MIMO ,020206 networking & telecommunications ,02 engineering and technology ,010501 environmental sciences ,01 natural sciences ,Antenna array ,0202 electrical engineering, electronic engineering, information engineering ,Resource allocation ,Communication complexity ,Computer Science::Information Theory ,0105 earth and related environmental sciences ,Power control ,Efficient energy use - Abstract
This paper considers a cell-free massive MIMO system with multiple-antenna access points and single-antenna users. The APs use conjugate beamforming to beamform the data to all users in the network. Total energy efficiency maximization is investigated. This optimization problem is nonconvex and thus difficult to solve. Existing solutions are based on second-order optimization methods in connection with convex approximations. These methods have been shown to perform very well but their complexity does not scale favorably with the network size. To tackle this issue, in this paper we propose to use a first-order method for nonconvex programming to our energy efficiency problem. Compared to the second-order methods, the proposed method achieves the same performance, while its run time is much faster. Thus, it is considered as a feasible solution for resource allocation in cell-free massive MIMO systems.
- Published
- 2019
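The flavour of a first-order method for power control can be sketched on a much simpler problem than the paper's energy-efficiency ratio. The channel gains, budget, step size and concave sum-rate objective below are all invented for illustration:

```python
import math

# Projected-gradient sketch for a toy power-control problem: maximise
# sum_i log(1 + g_i p_i) subject to p_i >= 0 and sum_i p_i <= P.
g = [2.0, 1.0, 0.5]      # made-up channel gains
P = 3.0                  # total power budget
p = [1.0, 1.0, 1.0]      # initial powers

def project(v):
    """Euclidean projection onto {p >= 0, sum p <= P}, via bisection on the
    common shift when the budget constraint is violated."""
    if sum(max(t, 0.0) for t in v) <= P:
        return [max(t, 0.0) for t in v]
    lo, hi = 0.0, max(v)
    for _ in range(60):
        mid = (lo + hi) / 2
        if sum(max(t - mid, 0.0) for t in v) > P:
            lo = mid
        else:
            hi = mid
    return [max(t - hi, 0.0) for t in v]

for _ in range(500):     # gradient ascent step, then project back to the set
    grad = [g[i] / (1 + g[i] * p[i]) for i in range(3)]
    p = project([p[i] + 0.1 * grad[i] for i in range(3)])

rate = sum(math.log(1 + g[i] * p[i]) for i in range(3))
```

For this instance the iterates approach the water-filling solution p = [5/3, 7/6, 1/6]; each iteration costs only a gradient and a projection, which is why first-order schemes scale better than second-order ones as the network grows.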
42. Fuzzy Portfolio Optimization of Onshore Wind Power Plants.
- Author
-
Madlener, Reinhard, Glensk, Babara, and Weber, Veronika
- Subjects
WIND power plants ,ELECTRIC power production ,FUZZY sets ,DECISION making ,MATHEMATICAL optimization - Abstract
In this paper we apply fuzzy set theory to the portfolio optimization of power generation assets, using a semi-mean absolute deviation (SMAD) model as a benchmark and a fuzzy semi-mean absolute deviation (FSMAD) model for comparison. The two models are applied to five onshore wind power plants in Germany considered for the portfolio analysis. The results show that the combinations of favorable assets for efficient portfolios are very similar, although the portfolio shares are markedly different. Also, the return and risk spans of the SMAD model are much broader than those of the FSMAD model. The highest returns are generated by portfolios based on the latter model. By offering fewer portfolio choices, the FSMAD model thus facilitates decision-making. This is in compliance with the notion that portfolio optimization by fuzzy set theory is better able to account for the decision-maker's preferences under real-world conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2021
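The SMAD risk measure used as the benchmark above can be computed in a few lines. The two-asset return scenarios are made up, and this sketches only the crisp risk measure, not the fuzzy extension:

```python
# Semi-mean-absolute-deviation (SMAD) risk of a portfolio over scenario
# returns (toy data): only shortfalls below the mean count as risk.
returns = [                     # scenario returns for assets A and B
    (0.10, 0.02),
    (-0.05, 0.03),
    (0.07, 0.01),
    (0.00, 0.02),
]

def smad(weights):
    port = [sum(w * r for w, r in zip(weights, scen)) for scen in returns]
    mean = sum(port) / len(port)
    return sum(max(mean - r, 0.0) for r in port) / len(port)
```

Minimising `smad` over the weights, subject to a target mean return, yields the SMAD-efficient portfolios that the paper compares against their fuzzy counterparts.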
43. Data-driven approach of scheduling the ratio of waste paper and pulp properties prediction
- Author
-
Shen Wen-hao and Liu Zhang
- Subjects
Engineering ,Mathematical optimization ,business.industry ,Pulp (paper) ,Scheduling (production processes) ,Paper mill ,02 engineering and technology ,010501 environmental sciences ,engineering.material ,Machine learning ,computer.software_genre ,01 natural sciences ,Data-driven ,Support vector machine ,Genetic algorithm ,0202 electrical engineering, electronic engineering, information engineering ,Mill ,020201 artificial intelligence & image processing ,Artificial intelligence ,business ,computer ,Predictive modelling ,0105 earth and related environmental sciences - Abstract
The study aimed to realize automatic production scheduling of waste paper in paper mills. Based on the large quantities of field data (mixing ratios of waste paper and pulp properties) stored in a paper mill, back-propagation neural networks (BP-NN) and support vector machines (SVM) were first applied, with both the full data set and the average data set, to develop prediction models of the pulp properties; a genetic algorithm was then used to search for suitable scheduling solutions. The simulation results revealed that the prediction model built with the SVM method and the average data set, having acceptable prediction accuracy and fast training time, was good enough to be used within the genetic algorithm to search for the optimal scheduling ratios of waste paper. The obtained optimal scheduling solution not only had the lowest purchase cost of waste paper, but also met the requirements of the mill: specified types of waste paper, a specified ratio of some kinds of waste paper, and the required pulp brightness.
- Published
- 2016
44. An effective soft computing technology based on belief-rule-base and particle swarm optimization for tipping paper permeability measurement
- Author
-
Bin Qian, Qian Qian Wang, Rong Hu, Chuan Qiang Yu, Zhi Jie Zhou, and Zhiguo Zhou
- Subjects
Structure (mathematical logic) ,Soft computing ,0209 industrial biotechnology ,Mathematical optimization ,Optimization problem ,General Computer Science ,Computer science ,Particle swarm optimization ,Computational intelligence ,02 engineering and technology ,Base (topology) ,Nonlinear system ,020901 industrial engineering & automation ,0202 electrical engineering, electronic engineering, information engineering ,Factory (object-oriented programming) ,020201 artificial intelligence & image processing - Abstract
This paper proposes a soft computing technology based on a belief rule base (BRB) system for tipping paper permeability measurement in a tobacco factory. In current studies of BRB, both the referential values of the antecedent attributes and the utilities of the consequents are given in advance and are not trained using dedicated optimization algorithms. The limitations of expert knowledge may introduce errors into the BRB, because both the referential values and the utilities directly affect the structure of the BRB, and an appropriate structure helps tune the parameters more accurately. Therefore, this paper focuses on the structure and parameter optimization of BRB (SPO-BRB), taking the referential values of the antecedent attributes and the utilities of the consequents into account to improve the input–output modeling ability of BRB. However, SPO-BRB is a nonlinear nonconvex optimization problem (NNOP). To deal with it, a particle swarm optimization algorithm with an improved velocity update and repair methods (PSO_VR) is proposed. A case study based on data collected from a tobacco factory in China is carried out. The test results demonstrate the functionality of SPO-BRB and the effectiveness of PSO_VR.
- Published
- 2017
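The PSO machinery underlying the proposed PSO_VR can be sketched in its standard global-best form. The inertia and acceleration constants are common defaults and the sphere test function is ours; the paper's variant further modifies the velocity update and adds repair methods:

```python
import random

# Standard global-best PSO sketch: each particle's velocity blends inertia,
# attraction to its personal best, and attraction to the swarm's global best.
random.seed(0)
f = lambda x: sum(v * v for v in x)            # minimisation objective
DIM, POP, ITERS = 2, 15, 150
W, C1, C2 = 0.7, 1.5, 1.5                      # inertia and acceleration weights

pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
vel = [[0.0] * DIM for _ in range(POP)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=f)[:]

for _ in range(ITERS):
    for i in range(POP):
        for d in range(DIM):
            vel[i][d] = (W * vel[i][d]
                         + C1 * random.random() * (pbest[i][d] - pos[i][d])
                         + C2 * random.random() * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if f(pos[i]) < f(pbest[i]):            # update personal and global bests
            pbest[i] = pos[i][:]
            if f(pos[i]) < f(gbest):
                gbest = pos[i][:]
```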
45. Automated paper impurities evaluation using feature representations based on ADMM sparse codes.
- Author
-
Qizi, Huangpeng, Huang, Wenwei, and Shi, Hanyi
- Subjects
- *
COMPUTER vision , *SPARSE approximations , *SPARSE graphs , *MATHEMATICAL optimization , *INFORMATION resources management - Abstract
To automatically detect and characterize paper impurities with computer vision, we present a novel two-part evaluation procedure with feature representations using Alternating Direction Method of Multipliers (ADMM) sparse codes. The method is based on an offline training step that obtains sparse coefficients and codebooks by learning extracted features with ADMM optimization, followed by an online detection step that uses a linear SVM classifier to distinguish defective paper samples from non-defective ones. Our approach bridges the gap between paper impurity evaluation and sparse feature representations, taking advantage of existing ADMM algorithms to handle the sparse coding problem. We compare different feature descriptors and sparse coding methods to implement the procedure and experimentally validate it on a dataset of 11 paper classes. Experimental results show that the proposed method is competitive and effective in terms of evaluation accuracy and speed. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
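The core ADMM sparse-coding step the entry above relies on can be sketched as the standard ADMM iteration for the lasso. This is a generic textbook formulation, assumed here for illustration; the paper's codebook learning and SVM classification stages are not reproduced:

```python
import numpy as np

def admm_lasso(D, y, lam=0.1, rho=1.0, iters=300):
    """ADMM sketch for the sparse-coding subproblem
    min_x 0.5*||D x - y||^2 + lam*||x||_1."""
    n = D.shape[1]
    DtD, Dty = D.T @ D, D.T @ y
    A_inv = np.linalg.inv(DtD + rho * np.eye(n))  # cached for the x-update
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    for _ in range(iters):
        x = A_inv @ (Dty + rho * (z - u))                                # quadratic step
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)  # soft-threshold
        u = u + x - z                                                    # dual update
    return z

# Toy example: recover a 2-sparse code from noiseless measurements
rng = np.random.default_rng(0)
D = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[2], x_true[7] = 1.5, -2.0
y = D @ x_true
code = admm_lasso(D, y, lam=0.05)
```

In the paper's setting the resulting sparse codes would serve as feature vectors for the linear SVM; here `code` simply recovers the planted support.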
46. Partial-ACO Mutation Strategies to Scale-Up Fleet Optimisation and Improve Air Quality (Best Application Paper)
- Author
-
Darren M. Chitty
- Subjects
Mathematical optimization, Tree traversal, Traverse, Heuristic (computer science), Computer science, Range (aeronautics), Mutation (genetic algorithm), Quality (business), Ant colony, Air quality index - Abstract
Fleet optimisation can significantly reduce the time vehicles spend traversing road networks, leading to lower costs and increased capacity. Moreover, reduced road use leads to lower emissions and improved air quality. Heuristic approaches such as Ant Colony Optimisation (ACO) are effective at solving fleet optimisation but scale poorly when dealing with larger fleets. The Partial-ACO technique has substantially improved ACO's capacity to optimise large-scale vehicle fleets, but there is still much scope for improvement. One way to achieve this could be to integrate simple mutation with Partial-ACO, as used by other heuristic methods. This paper explores a range of mutation strategies for Partial-ACO to both improve solution quality and reduce computational costs. It is found that substituting a majority of ant simulations with simple mutation operations improves both the accuracy and efficiency of Partial-ACO. For real-world fleet optimisation problems of up to 45 vehicles and 437 jobs, reductions in fleet traversal of approximately 50% are achieved at much lower computational cost, enabling larger-scale problems to be tackled. Moreover, CO\(_{2}\) and NO\(_{\text {x}}\) emissions are cut by 3.75 kg and 1.71 g per vehicle per day respectively, improving urban air quality.
- Published
- 2020
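The "simple mutation operations" the abstract above substitutes for ant simulations are not spelled out; a minimal sketch of that idea, assuming a plain position-swap mutation inside an improvement-only search over route permutations, might look like this (the distance data and acceptance rule are illustrative assumptions, not the paper's Partial-ACO machinery):

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour over a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def mutate_swap(tour, rng):
    """Simple mutation: swap two random positions in the tour."""
    a, b = rng.sample(range(len(tour)), 2)
    child = tour[:]
    child[a], child[b] = child[b], child[a]
    return child

def mutation_search(dist, iters=2000, seed=0):
    """Improvement-only search driven purely by cheap swap mutations."""
    rng = random.Random(seed)
    best = list(range(len(dist)))
    best_len = tour_length(best, dist)
    for _ in range(iters):
        cand = mutate_swap(best, rng)
        cand_len = tour_length(cand, dist)
        if cand_len < best_len:  # keep improvements only
            best, best_len = cand, cand_len
    return best, best_len

# Five depots on a small grid; Euclidean distances
pts = [(0, 0), (0, 1), (1, 1), (1, 0), (2, 0)]
dist = [[((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 for bx, by in pts]
        for ax, ay in pts]
best, best_len = mutation_search(dist)
```

The point of the comparison in the paper is cost: one swap-and-evaluate is far cheaper than a full ant construction, so many such mutations can replace most ant simulations.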
47. Optimization of Sustainable Single-Machine Scheduling Problem : Short Research Paper, CSCI-ISCI
- Author
-
Dalila B.M.M. Fontes and S. Mahdi Homayouni
- Subjects
Mathematical optimization, Single-machine scheduling, Job shop scheduling, Computer science, Tardiness, Genetic algorithm, Scheduling (production processes), Manufacturing operations, Energy consumption, Preventive maintenance - Abstract
This work considers sustainable scheduling of manufacturing operations and preventive maintenance activities in a single-machine environment where the machine works continuously in three eight-hour shifts per day. The jobs can be produced at different processing speeds, which reduces energy consumption and/or processing times. In a tri-objective mixed integer linear programming model, sustainability is attained through minimizing total weighted earliness/tardiness (economic pillar), total energy consumption (environmental pillar), and the number of undesired activities (social pillar). Moreover, a multi-objective genetic algorithm finds near-optimal solutions in a timely manner. Numerical results will be presented at the conference.
- Published
- 2020
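The economic-pillar term named in the abstract above, total weighted earliness/tardiness, has a standard form that can be computed directly. The job data and weight names below are invented for illustration; the paper's full MILP also carries the energy and social objectives:

```python
def weighted_earliness_tardiness(completion, due, w_early, w_tardy):
    """Sum over jobs of w_early * earliness + w_tardy * tardiness,
    where earliness = max(due - completion, 0) and
          tardiness = max(completion - due, 0)."""
    total = 0.0
    for c, d, we, wt in zip(completion, due, w_early, w_tardy):
        total += we * max(d - c, 0) + wt * max(c - d, 0)
    return total

# Three jobs: one finishes early, one on time, one late
val = weighted_earliness_tardiness(
    completion=[4, 10, 15],
    due=[6, 10, 12],
    w_early=[1, 1, 1],
    w_tardy=[2, 2, 2],
)
# 1*(6-4) + 0 + 2*(15-12) = 8
```

Because earliness and tardiness are both penalized, this objective pushes completion times toward due dates from both sides, which is what makes it a just-in-time criterion.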
48. Review Paper on Implementation of Particle Swarm Optimization for Multi-Pass Milling Operation
- Author
-
Dr. Vishvajeet Potdar and Sagar Bahirje
- Subjects
Mathematical optimization, Computer science, Particle swarm optimization - Published
- 2020
49. Keynote Paper: From EDA to IoT eHealth: Promises, Challenges, and Solutions.
- Author
-
Firouzi, Farshad, Farahani, Bahar, Ibrahim, Mohamed, and Chakrabarty, Krishnendu
- Subjects
- *
INTERNET of things , *TELEMEDICINE , *INTELLIGENT sensors , *MATHEMATICAL optimization , *BIG data , *SCALABILITY - Abstract
The interaction between technology and healthcare has a long history. However, recent years have witnessed the rapid growth and adoption of the Internet of Things (IoT) paradigm, the advent of miniature wearable biosensors, and research advances in big data techniques for effective manipulation of large, multiscale, multimodal, distributed, and heterogeneous data sets. These advances have generated new opportunities for personalized precision eHealth and mHealth services. IoT heralds a paradigm shift in the healthcare horizon by providing many advantages, including availability and accessibility, the ability to personalize and tailor content, and cost-effective delivery. Although IoT eHealth has vastly expanded the possibilities to fulfill a number of existing healthcare needs, many challenges must still be addressed in order to develop consistent, safe, flexible, and power-efficient systems that are a suitable fit for medical needs. To enable this transformation, a large number of significant technological advancements in the hardware and software communities must come together. This keynote paper addresses all these important aspects of novel IoT technologies for smart healthcare: wearable sensors, body area sensors, advanced pervasive healthcare systems, and big data analytics. It identifies new perspectives and highlights compelling research issues and challenges, such as scalability, interoperability, device-network-human interfaces, and security, with various case studies. In addition, with the help of examples, we show how knowledge from CAD areas, such as large-scale analysis and optimization techniques, can be applied to the important problems of eHealth. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
50. Submodular Memetic Approximation for Multiobjective Parallel Test Paper Generation
- Author
-
Minh Luan Nguyen, Siu Cheung Hui, and Alvis Fong
- Subjects
Mathematical optimization, Linear programming, Maximum coverage problem, Approximation algorithm, Multi-objective optimization, Computer Science Applications, Submodular set function, Human-Computer Interaction, Control and Systems Engineering, Memetic algorithm, Artificial intelligence & image processing, Algorithm design, Local search (optimization), Electrical and Electronic Engineering, Software, Information Systems, Mathematics - Abstract
Parallel test paper generation is a biobjective distributed resource optimization problem, which aims to generate multiple similarly optimal test papers automatically according to multiple user-specified assessment criteria. Generating high-quality parallel test papers is challenging due to its NP-hardness in both collective objective functions. In this paper, we propose a submodular memetic approximation algorithm for solving this problem. The proposed algorithm is an adaptive memetic algorithm (MA), which exploits the submodular property of the collective objective functions to design greedy-based approximation algorithms for enhancing the steps of the multiobjective MA. Synergizing the intensification of the submodular local-search mechanism with the diversification of the population-based submodular crossover operator, our algorithm can jointly optimize the total quality maximization objective and the fairness quality maximization objective. Our MA can achieve provable near-optimal solutions in a huge search space of large datasets in polynomial runtime. Performance results on various datasets have shown that our algorithm has drastically outperformed the current techniques in terms of paper quality and runtime efficiency.
- Published
- 2017
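The greedy subroutines the abstract above builds its memetic steps on follow the classic pattern for submodular maximization: repeatedly take the element with the largest marginal gain, which for maximum coverage yields the well-known (1 - 1/e) approximation guarantee. The sketch below shows that pattern on a toy coverage instance; the set data and function names are assumptions for illustration, not the paper's test-paper objectives:

```python
def greedy_max_coverage(sets, k):
    """Greedy (1 - 1/e)-approximation for maximum coverage:
    pick, k times, the set with the largest marginal gain."""
    covered, chosen = set(), []
    for _ in range(k):
        best = max(range(len(sets)), key=lambda i: len(sets[i] - covered))
        if not (sets[best] - covered):
            break  # no remaining set adds new elements
        chosen.append(best)
        covered |= sets[best]
    return chosen, covered

# Toy instance: pick 2 of 4 question pools to cover the most topics
pools = [{1, 2, 3}, {3, 4}, {4, 5, 6, 7}, {1, 5}]
chosen, covered = greedy_max_coverage(pools, k=2)
```

The guarantee rests on submodularity (diminishing marginal gains as `covered` grows), which is exactly the property the paper exploits inside its local search and crossover operators.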