A tutorial on the method of moments

Authors :
Arvas, Ercüment
Sevgi, Levent
Doğuş Üniversitesi, Mühendislik Fakültesi, Elektronik ve Haberleşme Mühendisliği Bölümü
Publication Year :
2012
Publisher :
IEEE, 2012.

Abstract

The Method of Moments (MoM) is a numerical technique for approximately solving linear operator equations, such as differential or integral equations. The unknown function is approximated by a finite series of known expansion functions with unknown expansion coefficients. The approximate function is substituted into the original operator equation, and the resulting equation is tested so that the weighted residual is zero. This yields a set of simultaneous algebraic equations for the unknown coefficients, which are then solved by matrix methods. MoM has been used to solve a vast number of electromagnetic problems during the last five decades. In addition to the basic theory of MoM, some simple examples are given. To demonstrate the concept of minimizing weighted error, the Fourier series is also reviewed.

IEEE Antennas and Propagation Society
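The expand-test-solve procedure the abstract describes can be sketched on a standard textbook boundary-value problem, -f''(x) = 1 + 4x² with f(0) = f(1) = 0, using a Galerkin scheme in which the testing functions equal the expansion functions u_n = x - x^(n+1). This is an illustrative sketch, not code from the paper itself; the choice of basis, the closed-form inner products, and all names below are assumptions made for the example.

```python
import numpy as np

# Sketch of a Galerkin MoM solution of  -f''(x) = 1 + 4x^2,  f(0) = f(1) = 0
# (a common textbook example; exact solution f(x) = 5x/6 - x^2/2 - x^4/3).
# Expansion/testing functions u_n = x - x^(n+1) satisfy both boundary conditions.

def mom_solve(N):
    """Return the N Galerkin expansion coefficients alpha_1..alpha_N."""
    m = np.arange(1, N + 1)
    M, Nn = np.meshgrid(m, m, indexing="ij")
    # Matrix entries <u_m, L u_n> with L = -d^2/dx^2 work out to m*n/(m+n+1)
    # for this basis (integrals evaluated analytically on [0, 1]).
    Lmat = (M * Nn) / (M + Nn + 1.0)
    # Excitation entries <u_m, 1 + 4x^2> = 3/2 - 1/(m+2) - 4/(m+4).
    g = 1.5 - 1.0 / (m + 2) - 4.0 / (m + 4)
    # The weighted-residual conditions reduce to a linear algebraic system.
    return np.linalg.solve(Lmat, g)

def f_approx(x, alpha):
    """Evaluate the finite-series approximation sum_n alpha_n * (x - x^(n+1))."""
    return sum(a * (x - x ** (n + 1))
               for n, a in enumerate(alpha, start=1))

x = np.linspace(0.0, 1.0, 101)
exact = 5 * x / 6 - x ** 2 / 2 - x ** 4 / 3
alpha = mom_solve(3)
print(np.max(np.abs(f_approx(x, alpha) - exact)))
```

Because the exact solution here happens to lie in the span of the first three basis functions, three terms already reproduce it to machine precision; in general the error decreases as more expansion functions are added.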

Details

Language :
English
Database :
OpenAIRE
Accession number :
edsair.od......2662..05dfb849deb7c33a52e0cab0ee91c054