103 results
Search Results
2. SEP Royalty Rate Calculation on the Basis of the Present Value Added (PVA): A Model and its Application in the German Legal Order
- Author
-
Fabian Hoffmann and Jan Schmitz
- Subjects
History, Discounting, Polymers and Plastics, Present value, Computer science, Context (language use), Industrial and Manufacturing Engineering, Microeconomics, Order (exchange), Apportionment, Market power, Business and International Management, License, Valuation (finance)
- Abstract
The most common method for standard essential patent (SEP) valuation, and for determining a fair, reasonable and non-discriminatory (FRAND) royalty rate, is based on comparable license agreements. This method is relatively simple to apply but has several disadvantages. Firstly, license agreements are often not comparable on important points. Secondly, they are usually based on a selection by the submitting party, who has an interest in selecting only favorable license agreements for comparison. And thirdly, the selected agreements might not result from a fair negotiation, because of possible market power on either side. In this paper, we therefore suggest how the present value added (PVA) methodology can be used to calculate the FRAND value of specific SEP technologies, avoiding the biases that are intrinsic to the comparable-licensing approach. The PVA approach can also serve as a valuation methodology for the whole standard in the context of the Top-Down approach. To illustrate the approach, we present a simple interperiodical valuation model. To derive a FRAND royalty rate, we then suggest how the PVA created can be apportioned between the SEP holders and the implementer. The second contribution of this paper is therefore the normative discussion of the apportionment of the PVA. Finally, we show how the PVA methodology can be applied in practice by running a hypothetical example of a full FRAND royalty estimation. The German legal framework has been chosen because Germany is the European jurisdiction where an increasingly large number of SEP-related legal disputes are adjudicated.
- Published
- 2021
3. Another look at risk apportionment
- Author
-
Béatrice Rey, Michel Denuit, Institut de Statistique, Biostatistique et Sciences Actuarielles (ISBA), Université Catholique de Louvain = Catholic University of Louvain (UCL), Laboratoire de Sciences Actuarielle et Financière (SAF), Université Claude Bernard Lyon 1 (UCBL), and Université de Lyon-Université de Lyon
- Subjects
Economics and Econometrics, Applied Mathematics, Multiplicative function, Univariate, Stochastic dominance, Bivariate analysis, Function (mathematics), Humanities and Social Sciences/Economics and Finance, Sampling distribution, Apportionment, Economics, Econometrics, Expected utility hypothesis
- Abstract
This paper presents a general result on the random selection of an element from an ordered sequence of risks and uses this result to derive additive and cross risk apportionment. Preferences favoring an improvement of the sampling distribution in univariate or bivariate first-order stochastic dominance are those exhibiting additive or cross risk apportionment. The univariate additive and multiplicative risk apportionment concepts are then related to the notion of bivariate cross risk apportionment by viewing the single-attribute utility function of an aggregate position (sum or product of attributes) as a 2-attribute utility function. The results derived in the present paper allow one to further explore the connections between the different concepts of risk apportionment proposed so far in the literature.
- Published
- 2013
4. Application of receptor modeling methods
- Author
-
Philip K. Hopke and David D. Cohen
- Subjects
Atmospheric Science, Operations research, Computer science, Chemical mass balance, Pollution, Data science, Field (computer science), Variety (cybernetics), Receptor models, Identification (information), Positive matrix factorization, Modelling methods, Apportionment, Unmix, Data analysis, Factor analysis, Analysis tools, Waste Management and Disposal
- Abstract
The use of atmospheric compositional data for the identification and apportionment of sources has been ongoing for more than 40 years. Beginning in the 1960s, it was recognized that data analysis techniques could be applied to such data to resolve combinations of constituents that represent sources. In the late 1970s, these data analysis tools came to be called receptor models. This paper traces the early history of receptor models through those early papers and provides a historical introduction to the papers in this special issue, showing the state of the art in the field and the application of these modern tools to a variety of atmospheric data.
- Published
- 2011
5. Formula Apportionment or Separate Accounting? Tax-Induced Distortions of Multinationals' Locational Investment Decisions
- Author
-
Erich Pummerer and Regina Ortmann
- Subjects
History, Polymers and Plastics, Public economics, Investment (macroeconomics), Industrial and Manufacturing Engineering, Investment decisions, Market economy, European policy, Economic substance, Multinational corporation, Investment incentives, Apportionment, Return on investment, Economics, Business and International Management
- Abstract
We examine which tax allocation system leads to more severe distortions with respect to locational investment decisions. We consider separate accounting (SA) and formula apportionment (FA). The effects of both systems have been hotly debated in Europe in recent years, because the EU Member States are striving to implement a common European tax system that would lead to a switch from SA to FA. While existing studies focus primarily on the impact of taxes on locational decisions under either SA or FA, the main innovation of this paper is that it compares both systems with regard to the level of distortions they induce. We compare the optimal pre-tax investment decision with the optimal after-tax investment decision and infer from the difference in the allocation of investment funds which tax allocation system causes more severe distortions. We assume that the multinational group (MNG) has comprehensive book-income-shifting opportunities under SA. We find that the investment incentives under SA are opposed to those under FA for a profitable investment project: whereas under SA as much as possible should be invested in a high-tax country, under FA as much as possible should be invested in a low-tax country. The distortions of locational investment decisions tend to be more severe under SA than under FA if, from a pre-tax perspective, a greater share of investment funds should be invested in a low-tax country and the investment is profitable. Vice versa, locational decisions may be more distorted under FA if the optimal pre-tax investment decision requires investing a major share of funds in the high-tax country. In contrast to the often-stated insensitivity of FA to income shifting, we find that the introduction of a tax allocation system based on FA in Europe could lead to a severe shift of economic substance to low-tax countries.
The results of this paper are of particular interest for European policy makers and MNGs, as our findings may induce European MNGs to reassess their recent locational investment decisions in the face of a potential future change in the applied tax allocation system. (authors' abstract)
- Published
- 2015
6. Another Look at 'Coveting Thy Neighbor's Manufacturing'
- Author
-
David Merriman
- Subjects
Engineering, Standard error, Payroll, Apportionment, Income tax, Econometrics, Distribution (economics), Sample (statistics), Replicate, Business
- Abstract
Goolsbee and Maydew (2000) reported that lowering the weight on payroll in states' corporate income tax apportionment formulae had the potential to raise manufacturing employment. Their analyses continue to be cited in academic articles and are still influential in the policy debate. I gather data and attempt to replicate their analyses and findings. I identify an apparent but inconsequential error in G&M's sample, and I replicate the most widely cited result in the original paper. Other results are substantively but not quantitatively replicated. I show that G&M's results are sensitive to relatively arbitrary choices about the sample that is used. I argue that the most cited result in the paper does not come from the preferred econometric specification, and that when the preferred specification is used, G&M's original paper found no statistically significant evidence that lowering the apportionment weight on payroll raises employment. Similarly, when I use this specification with data covering the period G&M studied (1978 to 1994), I find no statistically significant evidence for this hypothesis. When I extend the data set forward to 2010 and rerun the same specifications, I find results similar to those found with the earlier data, but with increased statistical significance when standard errors are not clustered by state. When standard errors are clustered by state, as is now common econometric practice, none of the key estimated coefficients are statistically significant. I get very similar results whether I use manufacturing employment or manufacturing payroll as the dependent variable.
In summary, the econometric evidence supporting the hypothesis that changes in the payroll weight affected the distribution of manufacturing employment among US states in the 1978 to 1994 period appears less strong than G&M asserted, even when using G&M's data and methods. More recent data lends some support to G&M's conclusion, but the econometric evidence remains weak by current standards.
- Published
- 2014
7. CCCTB: The Employment Factor Game
- Author
-
Matthias Petutschnig and Eva Eberhartinger
- Subjects
Economics and Econometrics, Public economics, Directive, Taxable income, Tax rate, Microeconomics, Apportionment, Multinational corporation, Member state, Economics, Business, Business and International Management, Formulary apportionment, Law, Game theory, Public finance
- Abstract
The draft CCCTB Directive in the EU includes a suggested apportionment formula which allocates taxable profits to group member corporations and to the respective Member States. The draft directive delegates to the Member States the right to define one apportionment factor, the term 'Employee'; each Member State can choose a narrow or a broad definition, the latter also including atypical employment schemes. Using a game-theoretic approach, we show that defining 'Employee' broadly so as to maximize the Member State's share in the apportionment factor is only optimal when tax rate differences and different sizes of atypical employment schemes are disregarded. If such differentials and the multinational corporation's reactions to different domestic definitions are included, a narrow definition of 'Employee' yields the highest individual pay-offs to the countries involved. Our paper differs from previous research on the economic effects of the CCCTB apportionment formula in that it is the first to analyse the employment factor and its distortive effects. We discuss possible tax-minimizing strategies for multinationals that shift employment and develop a model to quantify these potential relocations. The results of our paper may be relevant for the European Commission and the Council when debating the details of formula apportionment. Furthermore, we show how Member States could use the 'Employee' definition to both minimize factor emigration and maximize factor immigration, if the factor definitions remain unchanged.
- Published
- 2014
8. Apportionment of sulfur oxides at Canyonlands during the winter of 1990—III. Source apportionment of SOx and sulfate and the conversion of SO2 to sulfate in the Green River Basin
- Author
-
Delbert J. Eatough, Norman L. Eatough, and Michele Eatough
- Subjects
Hydrology, Atmospheric Science, Geography, National park, Air pollution, Drainage basin, Chemical mass balance, Particulates, Sulfur, Apportionment, Environmental science, Sulfate, General Environmental Science
- Abstract
During January–March of 1990, a study was conducted to determine the sources of sulfur oxide emissions present in the Canyonlands area in Utah. Samples were collected at the Island in the Sky visitor center in Canyonlands National Park and at several sites circling the park to characterize the chemical composition of air masses influencing the receptor sites from various geographical regions. The results of the sampling program and the identified chemical fingerprints of the sources that can impact the Canyonlands receptor site were given in the first two papers of this three-paper series. This paper presents the results of chemical mass balance source apportionment of the SOx present at Canyonlands, Green River, Bullfrog Marina and Edge of the Cedars, Utah, using the chemical composition and source fingerprint data given in the preceding papers. The results indicate that the presence of sulfur oxides in the Canyonlands area is a regional problem not dominated by a single source. The contributions to SOx at Canyonlands during the 21 days included in the source apportionment analyses came from sources to the southwest (37%), south and southeast (20%), north and northeast (19%), and northwest (23%). At Edge of the Cedars State Park, to the southeast of Canyonlands, sources from the southeast contributed 51% of the observed SOx. At Bullfrog Marina in the Lake Powell National Recreation Area, southwest of Canyonlands, sources to the southwest were responsible for 81% of the SOx present. At Green River, to the north of Canyonlands, the contribution of sources to the north and northeast was reduced (10%) because the major transport path from these directions was the movement of emissions from northwestern Colorado down the Colorado River drainage, passing south of Green River.
The apportionment of sulfate at Canyonlands has been estimated from a combination of the chemical mass balance SOx source apportionment results, the measured concentrations of SO2 and particulate sulfate, and meteorological data. This analysis indicates that, while the main source of SOx at Canyonlands is emissions to the southwest, the main source of sulfate is SOx emissions originating to the southeast of Canyonlands.
- Published
- 1996
9. The Apportionment of Takeover Wealth Gains Over Investor Groups
- Author
-
Anna Christine McAdam
- Subjects
Extant taxon, Shareholder, Financial economics, Apportionment, Realisation, Business, Profit (economics)
- Abstract
This paper identifies and apportions the wealth gains realised from a takeover event for four investor groups: the domestically domiciled investor categories of superannuation funds, nominees, incorporated companies, and individuals. For this analysis, it employs a trading-profit performance measure. In addition, for robustness, a marked-to-market (MTM) return is also determined for a small, medium, and large order-size proxy. In the MTM analysis, passive and aggressive trades were distinguished. This study finds that the investors/proxies identified as informed traders realise a higher performance from the takeover event. The results thus qualify the extant takeover-research conclusion that target-firm shareholders are the 'winners' from a takeover: this research concludes that it is the informed investor who 'wins' and, furthermore, that they do so at the expense of the less informed trader. This paper consequently provides a more complete description and understanding of the wealth gains from takeovers, which was the research goal.
- Published
- 2011
10. Fairness in ATM networks
- Author
-
Moshe Zukerman and Sammy Chan
- Subjects
Focus (computing), Computer science, Apportionment, Spare part, General Engineering, Bandwidth (computing), Broadband Integrated Services Digital Network, Variable bitrate, Statistical time division multiplexing, Computer network
- Abstract
In the future broadband ISDN, which will be based on statistical multiplexing, three traffic categories may generally be defined: (1) guaranteed-bandwidth traffic, (2) non-guaranteed variable bit rate video traffic, and (3) other non-guaranteed variable bit rate traffic (mainly data), which will compete for the spare capacity. The special focus of this paper is a fairness definition for the third traffic category, applied mainly to customer-premises or local ATM networks. The paper provides a quantitative standard for the apportionment of the spare capacity among traffic streams of the third category, taking into consideration the special capacity requirements of different sources.
- Published
- 1993
11. Insights into cooking sources in the context of sustainable development goals
- Author
-
Jing Li
- Subjects
Sustainable development, Environmental Engineering, Process management, Renewable Energy, Sustainability and the Environment, Computer science, Ecological environment, Food and beverages, Context (language use), Industrial and Manufacturing Engineering, Identification (information), Human health, Apportionment, Environmental Chemistry
- Abstract
There need to be continual improvements in the research methods used in cooking studies to benefit the ecological environment and human health. The literature must be systematically reviewed to facilitate the discovery, optimization, and rational design of cooking studies in the context of sustainable development goals. Thus, this paper raises several important design considerations for cooking studies that should be implemented in future measurements. Herein, 131 scientific articles were selected and analyzed according to four aspects: (1) influential factors, (2) cooking emissions, (3) health effects, and (4) source apportionment. The analysis led to the identification of representative test methods and a perspective on the future of research in the identified aspects.
- Published
- 2021
12. Corporate Average Tax Rates Under the CCCTB and Possible Methods for International Loss-Offset
- Author
-
Reinald Koch and Andreas Oestreicher
- Subjects
050208 finance ,Public economics ,05 social sciences ,1. No poverty ,Value-added tax ,Common Consolidated Corporate Tax Base ,Incentive ,Apportionment ,Multinational corporation ,8. Economic growth ,0502 economics and business ,media_common.cataloged_instance ,Business ,Neutrality ,050207 economics ,European union ,Corporate tax ,media_common - Abstract
This paper provides an assessment of the potential consequences for average corporate tax rates that would result from implementation of a Common Consolidated Corporate Tax Base (CCCTB) as proposed by the European Commission, and of possible methods for achieving an EU-wide loss-offset for multinational groups. To this end, we apply a comparative-static micro-simulation approach based on ten-year data for a sample of 119,645 European domestic and multinational groups taken from the AMADEUS database. We find that by making the CCCTB mandatory in the EU, the extension of intra-group loss-offset possibilities and the use of formula apportionment would enhance the attractiveness of the EU as an investment location. The mean of the average tax rates in the member states would be reduced by 0.77 per cent for domestic groups and by 2.15 per cent for multinational groups. Furthermore, the paper reveals that a mandatory CCCTB would reduce the variation in average tax rates across the member states as well as the existing differences between domestic and multinational groups. An optional CCCTB, however, would increase the inter-jurisdictional variation in average tax rates and favour multinational groups, thus creating incentives for cross-border investment. The methods for cross-border group relief considered here would constitute less efficient mechanisms for achieving loss-offset than a CCCTB. Our findings show that these alternative loss-offset mechanisms would not guarantee relief to the same extent and are less effective in terms of inter-jurisdictional and intersectoral neutrality. However, in contrast to an optional CCCTB, each of the three methods would ensure equitable treatment of domestic and multinational groups.
- Published
- 2008
13. Patent Damages Reform and the Shape of Patent Law
- Author
-
David W. Opderbeck
- Subjects
Price elasticity of demand, Patent troll, Apportionment, Political science, Law, Liability, Damages, Patent infringement, Empirical evidence, Supreme court, Law and economics
- Abstract
The shape of patent law is changing. Surprisingly, one of the most significant of these changes is rooted in the arcana of how damages are calculated for patent infringement. Current reform proposals before Congress, which are hotly contested by major technology-rich industries, would radically alter the shape of the patent grant by requiring courts to tease out the "economic value" of the claimed invention as compared with previously existing technology. This paper responds empirically and theoretically to this attempt to reshape patent law through the back door of damages.
Advocates of the damages reform proposals cite empirical evidence that patent verdicts are growing excessively large. This paper reviews the existing empirical literature and presents an original study of patent verdict data obtained from the Administrative Office of the Courts. The literature review and original study presented in this paper suggest that the empirical arguments made by reform advocates are largely misplaced.
This paper also examines the theoretical underpinnings of the remedial structure for patent infringement. It discusses a string of recent Supreme Court opinions in which patent law appears to be moving from a property rule towards a liability rule of remedies. Finally, the paper examines two key factors that have been ignored in the existing patent reform debate: price elasticity of demand and risk. Theoretical models are presented that demonstrate why attempts at reform should focus on shifting towards a restitutionary model of patent damages, with a possible premium for risk.
- Published
- 2008
14. Risk Apportionment: A Story of Moments?
- Author
-
Patrick Roger
- Subjects
Combinatorics, Lottery, Apportionment, Simple (abstract algebra), Generalization, Economics, Order (group theory), Risk aversion (psychology), Function (mathematics), Mathematical economics, Sign (mathematics)
- Abstract
In a recent paper, Louis Eeckhoudt and Harris Schlesinger [1] established a theorem linking the sign of the n-th derivative of an agent's utility function to her preferences among pairs of simple lotteries. In this paper, we characterize these lotteries and show that they differ only in their moments of order greater than or equal to n. When the n-th derivative is positive (negative) and n is odd (even), the agent prefers a lottery with higher (lower) (n+2j)-th moments, for j in the natural numbers. This result is a generalization of a proposition in Ekern [2], which focused only on the difference of the n-th moments.
- Published
- 2007
15. Source apportionment of PM2.5 concentrations with a Bayesian hierarchical model on latent source profiles
- Author
-
Jing-Shiang Hwang, Jia-Hong Tang, and Shih-Chun Candice Lung
- Subjects
Pollution, Atmospheric Science, Multivariate statistics, Particulates, Apportionment, Simulated data, Environmental science, Bayesian hierarchical modeling, Bayesian framework, Data mining, Waste Management and Disposal
- Abstract
Identifying realistic pollution source profiles and quantifying the contributions of atmospheric particulate matter are crucial for the development of pollution mitigation strategies to protect public health. In this paper, we propose a multivariate source apportionment model that places a Bayesian framework on latent source profiles, so that expert knowledge regarding emissions can inform source profile estimation, and atmospheric effects, such as meteorological conditions, can improve source contribution estimates. This approach maintains positivity and summation constraints for source contributions and profiles. Furthermore, available expert knowledge regarding source profiles is incorporated as prior knowledge, avoiding restrictive assumptions about the presence or absence of chemical constituent tracers in source profile modeling. We used long-term PM2.5 measurements collected from two locations with different environmental characteristics in northern Taiwan to demonstrate the feasibility of the proposed model, and evaluated its performance using simulated data.
- Published
- 2020
16. A review of surface ozone source apportionment in China
- Author
-
Hailing Liu, Meigen Zhang, and Xiao Han
- Subjects
Atmospheric Science, Literature review, Long-term trends, Ozone source apportionment, Air pollution, Particulates, Oceanography, Surface ozone, Apportionment, Environmental chemistry, Environmental science, China
- Abstract
Air pollution caused by particulate matter has decreased significantly in China in recent years since the implementation of a series of stringent clean-air regulations. However, surface ozone concentrations have increased, especially in developed city clusters such as the Beijing–Tianjin–Hebei, Yangtze River Delta, Pearl River Delta, and Sichuan Basin regions. Because ozone formation is complex and nonlinear, accurately locating the major sources of ozone and its precursors is an important basis for formulating cost-effective pollution control strategies. In this paper, the authors systematically summarize the methods and main conclusions of ozone source apportionment (by region and by category) in China from the published literature, covering both observation-based and emission-based methods. The authors aim to provide a comprehensive understanding of ozone pollution and reliable references for the formulation of air pollution prevention policies in China.
- Published
- 2020
17. Corrigendum to 'Chemical mass balance source apportionment for combined PM2.5 measurements from U.S. non-urban and urban long-term networks' [Atmos. Environ. 44 (2010) 4908–4918]
- Author
-
Lisa Herschberger, David Dubois, Judith C. Chow, L.-W. Antony Chen, and John G. Watson
- Subjects
Hydrology, Total organic carbon, Atmospheric Science, Chemistry, Sampling (statistics), Chemical mass balance, Table (information), Atmospheric sciences, Term (time), Nitrate, Apportionment, Sulfate, General Environmental Science
- Abstract
A mislabeling of sampling sites was discovered after final proofing of the paper. GRRI data were labeled as BLMO, and BLMO data were labeled as GRRI, in Tables 2–3, Figs. 2 and 3, and supplemental Tables S1 and S4. Text that explains the tables and figures also needs to be corrected. Since the two sites were discussed as a group for the most part, the mislabeling does not change the conclusions of the paper except for Section 5.2: Inter-network differences. In Section 5.2, GRRI source apportionment results were compared with those from ROC (see the 5th and 6th paragraphs). It was concluded that ROC and GRRI (really BLMO) showed similar contributions from the regional sources secondary sulfate (including coal-fired utility), secondary nitrate, and soil dust. When ROC is compared with the actual GRRI, secondary sulfate contributions are similar (2.67 versus 2.72 μg m−3), but ROC shows higher secondary nitrate, secondary organic carbon, soil dust, Ca-rich dust, engine exhaust, and road salt contributions than GRRI. Biomass burning contributions are 36–39% lower at ROC. These differences may be explained by local activities around ROC (e.g., traffic, construction, and de-icing) and GRRI (e.g., campfires). The reconstructed PM2.5 concentrations were 102% and 99% of measured PM2.5 mass at ROC and GRRI, respectively. Other changes include
- Published
- 2012
18. PM2.5 source apportionment for the port city of Thessaloniki, Greece
- Author
-
Dikaia Saraga, Evangelos I. Tolis, Thomas Maggos, Christos Vasilakos, and John G. Bartzis
- Subjects
Road dust, Pollution, Environmental Engineering, Sampling (statistics), Port (computer networking), Aerosol, Apportionment, Environmental Chemistry, Environmental science, Physical geography, Biomass burning, Waste Management and Disposal, Air quality index
- Abstract
This paper aims to identify the chemical fingerprints of potential PM2.5 sources and estimate their contribution to the air quality of the port city of Thessaloniki. To this end, a Positive Matrix Factorization model was applied to a comprehensive PM2.5 dataset collected over a one-year period at two sampling sites: the port and the city center. The model indicated six and five (groups of) sources contributing to particle concentrations at the two sites, respectively. Traffic and biomass burning (in the winter months) comprise the major local PM sources for Thessaloniki (their combined contribution can exceed 70%), revealing two of the major control-demanding problems of the city. Shipping and in-port emissions have a non-negligible impact (average contribution to PM2.5: 9–13%) on both primary and secondary particles. The road dust factor presents a different profile and contribution at the two sites (19.7% at the port; 7.4% at the city center). The secondary-particle factor represents not only aerosol transported over relatively long distances but also a part of traffic-related pollution (14% at the port; 34% at the city center). The study thereby contributes to the principal role of quantitative information on emission sources (source apportionment) in port cities for the implementation of air quality directives and guidelines for public health.
- Published
- 2019
19. Data driven estimation of novel COVID-19 transmission risks through hybrid soft-computing techniques
- Author
-
Rashmi Bhardwaj and Aashima Bangia
- Subjects
Soft computing, General Mathematics, Applied Mathematics, Hybrid Wavelet Neuronal-Fuzzification, Outbreak, Wavelet decomposition, nCoV-19, General Physics and Astronomy, Statistical and Nonlinear Physics, Mean Absolute Scaled Error (MASE), Wavelet, Geography, Moving average, Skewness, Apportionment, Pandemic, Statistics, Transmission risk, Symmetric Mean Absolute Percentage Error (sMAPE), China
- Abstract
Coronavirus disease 2019 (COVID-19) has been declared a serious health emergency and has drawn international attention owing to its spread to 201 countries at present. By April 2020, the pandemic outbreak had reached approximately 1,116,643 confirmed infections and around 59,170 recorded deaths worldwide. This article studies the spread of the pandemic, which originated in China, across multiple countries. It focuses on forecasting from real-time response data to gauge the increase in, and maximum number of, virus-infected cases for various regions. In addition, it helps in understanding the panic surrounding nCoV-19 in some intensely affected states whose demographic characteristics shape the disease's course. The study aims to develop soft-computing hybrid models for calculating the transmissibility of the virus, and the analysis aids the study of its outbreak towards other parts of the continent and the world. In the hybrid approach, wavelet decomposition splits the data into approximations and details, which are then trained and tested through a neuronal-fuzzification approach. The wavelet-based forecasting model predicts over shorter time spans, such as five to ten days ahead, the numbers of confirmed, death and recovered cases for China, India and the USA, while data-based prediction through interpolation with a moving average predicts over longer time spans, such as 50-60 days ahead, with lower accuracy than the wavelet-based hybrids. Based on the simulations, the significance level (alpha) ranges from 0.10 to 0.67, MASE from 0.06 to 5.76, sMAPE from 0.15 to 1.97, MAE from 22.59 to 6024.76, RMSE from 3.18 to 8360.29, and R2 from 0.0018 to 0.7149. MASE and sMAPE are relatively rarely applied, novel measures used here to improve accuracy; they eliminate skewness and make the model outlier-free. Estimates of the awaited outbursts for the regions in this study (India, China and the USA) will help improve the apportionment of healthcare facilities, as they can act as an early-warning system for government policy-makers. Thus, data-driven analysis provides deep insights into the transmission of this virus towards immensely affected countries. Through its analysis of transmission, the study also aims to reduce the panic and stigma that have spread like wildfire and become a significant part of this pandemic.
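The two headline accuracy measures, MASE and sMAPE, have standard definitions that are easy to reproduce; a minimal sketch with hypothetical series values (not data from the study):

```python
import numpy as np

def smape(actual, forecast):
    """Symmetric mean absolute percentage error, as a ratio (0..2 scale)."""
    a, f = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(2.0 * np.abs(f - a) / (np.abs(a) + np.abs(f)))

def mase(actual, forecast, train):
    """Mean absolute scaled error: MAE of the forecast divided by the
    in-sample MAE of a one-step naive forecast on the training series."""
    a, f = np.asarray(actual, float), np.asarray(forecast, float)
    t = np.asarray(train, float)
    scale = np.mean(np.abs(np.diff(t)))   # naive one-step-ahead errors
    return np.mean(np.abs(f - a)) / scale

train = [100, 110, 120, 130, 140]   # hypothetical case counts
actual = [150, 160]
forecast = [148, 165]
print(round(mase(actual, forecast, train), 3))   # → 0.35 (naive MAE = 10)
print(round(smape(actual, forecast), 4))         # → 0.0221
```

Because MASE scales by the naive forecast's error, values below 1 indicate the model beats the naive benchmark regardless of the magnitude of the series.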
- Published
- 2020
- Full Text
- View/download PDF
20. Potential cost implications of contracting risks – the views of bus operators in South Africa
- Author
-
Jackie Walters
- Subjects
Finance ,050210 logistics & transportation ,Government ,business.industry ,05 social sciences ,Economics, Econometrics and Finance (miscellaneous) ,Transportation ,Operator (computer programming) ,Apportionment ,Public transport ,0502 economics and business ,business ,health care economics and organizations ,050203 business & management ,Cost implications - Abstract
In South Africa, as in many countries internationally, public transport contracting costs are continuously scrutinised in the light of tight economic circumstances. In designing public transport contracts, it is important that an appropriate risk-sharing dispensation be considered to ensure that the relevant entity (the authority and/or operator) carries the risk it is best suited to manage. Inappropriate risk-sharing arrangements can result in additional costs being factored into contract bids by operators, increasing the overall cost of public transport for the authority. In addition, the design of the contract, e.g. net cost versus gross cost (and the associated risk apportionment), could have a bearing on its ultimate cost. This paper explores the risk views of 15 contracted bus operators representing 4950 buses in South Africa, based on their experiences over several years of public transport contracting. The lessons learned from this research will assist contracting authorities in understanding how operators view and respond to various controllable and uncontrollable risks related to public transport contracting, given the South African government's intention to embark on the next round of public transport contracting in 2018.
- Published
- 2018
21. Development of a chemical source apportionment decision support framework for lake catchment management
- Author
-
Sean Comber, Peter Daldorph, Michael Gardner, Brian Ellor, Carlos Constantino, and Russell Smith
- Subjects
Decision support system ,Environmental Engineering ,Geographic information system ,business.industry ,Environmental resource management ,Environmental engineering ,Legislation ,Pollution ,Polluter pays principle ,Water Framework Directive ,Apportionment ,Environmental monitoring ,Information system ,Environmental Chemistry ,Environmental science ,business ,Waste Management and Disposal - Abstract
Increasing pressures on natural resources have led to the adoption of water quality standards to protect ecological and human health. Lakes and reservoirs are particularly vulnerable to pressures on water quality owing to their long residence times compared with rivers. This raises the question of how to determine and quantify the sources of priority chemicals (e.g. nutrients, persistent organic pollutants and metals) so that suitable measures can be taken to address failures to comply with regulatory standards. Contaminants enter lake waters from a range of diffuse and point sources. Decision support tools and models are essential to assess the relative magnitudes of these sources and to estimate the impacts of any programmes of measures. This paper describes the development and testing of the Source Apportionment Geographical Information System (SAGIS) for the future management of 763 lakes in England and Wales. The model uses readily available national data sets to estimate the contributions of a number of key chemicals, including nutrients (nitrogen and phosphorus), metals (copper, zinc, cadmium, lead, mercury and nickel) and organic chemicals (polynuclear aromatic hydrocarbons), from multiple sector sources. Lake-specific sources (groundbait from angling and bird faeces) are included, as is the hydrology associated with pumped inputs and abstraction. Validation data confirm the model's ability to predict seasonal patterns of all types of contaminant concentrations under a number of hydrological scenarios. Such a tool has not previously been available on a national scale for such a wide range of chemicals, and it is currently being used to assist with future river basin planning.
- Published
- 2018
22. Comparative risk aversion with two risks
- Author
-
Kit Pong Wong
- Subjects
Economics and Econometrics ,Applied Mathematics ,05 social sciences ,Risk aversion (psychology) ,Bivariate analysis ,Measure (mathematics) ,Apportionment ,0502 economics and business ,Econometrics ,050206 economic theory ,Marginal utility ,Random variable ,050205 econometrics ,Mathematics - Abstract
This paper characterizes aversion to one risk in the presence of another, which is invariant to the size of the exposure to the former risk and consistent with the common bivariate risk preference for combining good with bad. We show that all bivariate utility functions that satisfy bivariate risk apportionment exhibit risk aversion with two risks if, and only if, the dependence structure of the two risks is characterized by the notion of expectation dependence. We then propose an intensity measure of risk aversion with two risks that is based on the utility premium normalized by the marginal utility evaluated at an arbitrarily chosen pair. We show that the intensity measure being uniformly larger is equivalent to the concept of greater generalized Ross risk aversion. An application to optimal prevention in a two-period model is presented when the dependence structure of the underlying random variables is governed by the notion of expectation dependence.
- Published
- 2021
23. The UK Integrated Assessment Model for source apportionment and air pollution policy applications to PM2.5
- Author
-
Anthony J. Dore, Daniel Mehlig, Helen ApSimon, Tim Oxley, Mike Holland, and Huw Woodward
- Subjects
010504 meteorology & atmospheric sciences ,PM2.5 concentrations ,Population ,Air pollution ,010501 environmental sciences ,medicine.disease_cause ,01 natural sciences ,Atmospheric Sciences ,Exceedance of WHO guideline ,Apportionment ,PM(2.5) concentrations ,medicine ,GE1-350 ,Population exposure ,education ,Integrated assessment modelling ,0105 earth and related environmental sciences ,General Environmental Science ,Pollutant ,education.field_of_study ,business.industry ,Environmental resource management ,Local scale ,Atmospheric dispersion modeling ,Environmental sciences ,Health ,Environmental science ,business - Abstract
Source apportionment and the effect of reducing individual sources provide important input for the development of strategies to address air pollution. The UK Integrated Assessment Model, UKIAM, has been developed for this purpose as a flexible framework, combining information from different atmospheric dispersion models to cover different pollutant contributions and to span the range from the European to the local scale. In this paper we describe the UKIAM as developed for SO2, NOx, NH3, PM2.5 and VOCs. We illustrate its versatility and application with an assessment of current PM2.5 concentrations and exposure of the UK population, a case study used as the starting point to investigate potential improvement towards attainment of the WHO guideline of 10 µg/m3.
- Published
- 2021
24. Airline mitigation of propagated delays via schedule buffers: Theory and empirics
- Author
-
Achim I. Czerny, Jan K. Brueckner, and Alberto A. Gaggero
- Subjects
050210 logistics & transportation ,Schedule ,021103 operations research ,Operations research ,Computer science ,05 social sciences ,0211 other engineering and technologies ,ComputerApplications_COMPUTERSINOTHERSYSTEMS ,Transportation ,02 engineering and technology ,Flight time ,Buffer (optical fiber) ,Apportionment ,0502 economics and business ,Business and International Management ,Host (network) ,Civil and Structural Engineering - Abstract
This paper presents an extensive theoretical and empirical analysis of the choice of schedule buffers by airlines. With airline delays a continuing problem around the world, such an undertaking is valuable, and its lessons extend to other passenger transportation sectors. One useful lesson from the theoretical analysis of a two-flight model is that the mitigation of delay propagation is done entirely by the ground buffer and the second flight's buffer. The first flight's buffer plays no role because the ground buffer is a perfect, nondistorting substitute for it. In addition, the apportionment of mitigation responsibility between the ground buffer and the second flight's buffer is shown to depend on the relationship between the costs of ground- and flight-buffer time. The empirical results show the connection between buffer magnitudes and a host of explanatory variables, including the variability of flight times, which simulations of the model identify as an important determining factor.
- Published
- 2021
25. Quantitative source apportionment and human toxicity of indoor trace metals at university buildings
- Author
-
Firoz Khan, Mohd Yasreen Ali, Mohd Talib Latif, and Marlia Mohd Hanafiah
- Subjects
Human toxicity ,Pollutant ,Environmental Engineering ,010504 meteorology & atmospheric sciences ,Geography, Planning and Development ,Environmental engineering ,Building and Construction ,010501 environmental sciences ,Particulates ,01 natural sciences ,Hazard quotient ,law.invention ,Indoor air quality ,law ,Apportionment ,Ventilation (architecture) ,Environmental science ,Inductively coupled plasma mass spectrometry ,0105 earth and related environmental sciences ,Civil and Structural Engineering - Abstract
This study focuses on principal-component-analysis-based source apportionment of indoor particulate matter (PM10) composition in two university buildings with different ventilation systems. A low volume sampler with Teflon filter paper was used to collect the PM10 samples, and inductively coupled plasma mass spectrometry was used to determine the concentrations of heavy metals. The potential human health damage due to the inhalation of carcinogenic and non-carcinogenic elements was also determined based on the USEPA standard. The PM10 concentrations recorded in Building 1 and Building 2 ranged between 19.1–237 μg m−3 and 23.4–159 μg m−3, respectively. In Building 1, principal component analysis (PCA) and multiple linear regression (MLR) showed that the main sources of pollutants in PM10 were crustal material (20%), indoor-induced sources (8%), urban origin (7%) and the Earth's crust (6%). The main sources of pollutants in Building 2 were combustion (21%), biogenic (6%), anthropogenic (4%) and crustal (3%) sources. The effective lifetime carcinogenic risks (ELCR) in Buildings 1 and 2 were 1.90E-3 and 1.65E-4, respectively. The hazard quotient (HQ) represents the non-carcinogenic risk, at 7.73 and 6.46 in Building 1 and Building 2, respectively. These ELCR and HQ values exceed the acceptable limits and are high compared with the standard from the United States Environmental Protection Agency's guidelines for the assessment of carcinogenic risk. The findings suggest that different types of ventilation influence the PM10 distribution in buildings and the associated risks to occupants' health and indoor air quality.
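The ELCR and HQ metrics follow the standard USEPA inhalation risk arithmetic (exposure concentration times inhalation unit risk; exposure concentration over reference concentration). A minimal sketch in which every parameter value (exposure time, frequency, duration, IUR, RfC, measured concentration) is illustrative, not taken from the study:

```python
def exposure_concentration(ca, et=24, ef=350, ed=24, at_hours=None):
    """EC (µg/m^3): time-weighted air concentration, USEPA inhalation style.
    ca: measured concentration (µg/m^3); et: h/day; ef: days/yr; ed: years."""
    if at_hours is None:
        at_hours = ed * 365 * 24        # non-carcinogenic averaging time
    return ca * et * ef * ed / at_hours

def elcr(ca, iur, lifetime_years=70):
    """Lifetime carcinogenic risk = EC averaged over a lifetime x IUR."""
    ec = exposure_concentration(ca, at_hours=lifetime_years * 365 * 24)
    return ec * iur

def hazard_quotient(ca, rfc):
    """HQ = EC / RfC (RfC given in mg/m^3, EC converted from µg to mg)."""
    return exposure_concentration(ca) / (rfc * 1000)

# Hypothetical arsenic-like element at 0.01 µg/m^3 (illustrative values)
risk = elcr(0.01, iur=4.3e-3)           # > 1e-6 flags elevated cancer risk
hq = hazard_quotient(0.01, rfc=1.5e-5)  # > 1 flags non-carcinogenic concern
```

In a study like this one, `risk` and `hq` would be computed per element and summed across the carcinogenic and non-carcinogenic species, respectively.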
- Published
- 2017
26. Source apportionment and a novel approach of estimating regional contributions to ambient PM2.5 in Haikou, China
- Author
-
Yinchang Feng, Jianhui Wu, Jixin Gao, Xiaohui Bi, Jiao Wang, Haihang Yang, Yufen Zhang, Baoshuang Liu, Jiamei Yang, and Tingkun Li
- Subjects
010504 meteorology & atmospheric sciences ,Meteorology ,business.industry ,Health, Toxicology and Mutagenesis ,General Medicine ,010501 environmental sciences ,Toxicology ,Atmospheric sciences ,01 natural sciences ,Pollution ,Apportionment ,Environmental science ,Coal ,business ,China ,0105 earth and related environmental sciences - Abstract
A novel approach was developed to estimate regional contributions to ambient PM2.5 in Haikou, China. The investigation was divided into two main steps. In the first step, the chemical composition of ambient PM2.5 and the source profiles were characterized, and source apportionments were conducted using the CMB and CMB-Iteration models. In the second step, an approach for estimating regional contributions was developed based on local features of Haikou and the source apportionment results, and regional contributions to ambient PM2.5 in Haikou were estimated with this new approach. The results indicate that secondary sulphate, resuspended dust and vehicle exhaust were the major sources of ambient PM2.5 in Haikou, contributing 9.9-21.4%, 10.1-19.0% and 10.5-20.2%, respectively. Regional contributions to ambient PM2.5 in Haikou in spring, autumn and winter were 22.5%, 11.6% and 32.5%, respectively. The regional contribution in summer was assumed to be zero, consistent with the better air quality in that season and the assumptions of the new estimation approach. The higher regional contribution in winter might be mainly attributable to the transport of polluted air originating in mainland China, especially from the north, where coal is burned for heating in winter.
- Published
- 2017
27. An energy apportionment model for a reheating furnace in a hot rolling mill – A case study
- Author
-
Biao Lu, Guang Chen, Demin Chen, and Weiping Yu
- Subjects
Engineering ,Discretization ,business.industry ,020209 energy ,Metallurgy ,Energy Engineering and Power Technology ,02 engineering and technology ,Structural engineering ,Energy consumption ,Industrial and Manufacturing Engineering ,020401 chemical engineering ,Apportionment ,0202 electrical engineering, electronic engineering, information engineering ,Production (economics) ,Rolling mill ,0204 chemical engineering ,business ,Energy source ,Beam (structure) ,Energy (signal processing) - Abstract
To characterize the energy consumption of different types of steel billet in a reheating furnace, this paper establishes an energy apportionment model based on the actual production performance and energy-source data of a walking beam reheating furnace. A cumulative energy segment is defined, and the time period spanning from the billet loading time to the billet unloading time is divided into k cumulative energy segments. The formula is discretized in integral form, and the energy allocation ratio is determined for every cumulative energy segment. A case study shows that the energy allocation differs with billet width, production rhythm and steel grade: the greater the width and the slower the production rhythm, the higher the energy allocation of the steel billet, and vice versa. The energy allocation also differs across individual steel grades. The results show that the energy apportionment model has significant implications for the formulation of the steel billet loading plan, the control of the production rhythm and energy assessment, and can be used to achieve energy-lean operation.
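The segment-wise discretization can be sketched as follows: for each time segment, split that segment's fuel energy among the billets resident in the furnace, then sum per billet over the segments of its residence. The mass-proportional weighting and all figures below are illustrative assumptions, not the paper's allocation ratio:

```python
# Hypothetical sketch of segment-wise energy apportionment: each time
# segment's energy is split among resident billets in proportion to mass.

def apportion_energy(segments, billets):
    """segments: list of (energy_MJ, set_of_billet_ids) per time segment.
    billets: dict id -> mass in tonnes. Returns MJ allocated per billet."""
    alloc = {bid: 0.0 for bid in billets}
    for energy, resident in segments:
        total_mass = sum(billets[b] for b in resident)
        for b in resident:
            alloc[b] += energy * billets[b] / total_mass
    return alloc

billets = {"A": 20.0, "B": 25.0}           # tonnes (illustrative)
segments = [(900.0, {"A"}),                # only billet A in the furnace
            (1200.0, {"A", "B"}),          # both billets resident
            (800.0, {"B"})]                # A already discharged
alloc = apportion_energy(segments, billets)
```

By construction the allocations sum to the total furnace energy over the period, so widening a billet or slowing the rhythm (more resident segments) raises its share, matching the case-study pattern.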
- Published
- 2017
28. Airline Mitigation of Propagated Delays: Theory and Empirics on the Choice of Schedule Buffers
- Author
-
Achim I. Czerny, Jan K. Brueckner, and Alberto A. Gaggero
- Subjects
Schedule ,Operations research ,Computer science ,Apportionment ,ComputerApplications_COMPUTERSINOTHERSYSTEMS ,Host (network) ,Buffer (optical fiber) ,Connection (mathematics) - Abstract
This paper presents an extensive theoretical and empirical analysis of the choice of schedule buffers by airlines. With airline delays a continuing problem around the world, such an undertaking is valuable, and its lessons extend to other passenger transportation sectors. One useful lesson from the theoretical analysis of a two-flight model is that the mitigation of delay propagation is done entirely by the ground buffer and the second flight's buffer. The first flight's buffer plays no role because the ground buffer is a perfect, nondistorting substitute. In addition, the apportionment of mitigation responsibility between the ground buffer and the flight buffer of flight 2 is shown to depend on the relationship between the costs of ground- and flight-buffer time. The empirical results show the connection between buffer magnitudes and a host of explanatory variables, including the variability of flight times, which simulations of the model identify as an important determining factor.
- Published
- 2019
29. Short-term effects of anthropogenic/natural activities on the Tehran criteria air pollutants: Source apportionment and spatiotemporal variation
- Author
-
Omid Ghaffarpasand, Zahra Davari Shalamzari, and Saeed Nadi
- Subjects
Environmental Engineering ,Ozone ,Geography, Planning and Development ,0211 other engineering and technologies ,02 engineering and technology ,Building and Construction ,010501 environmental sciences ,Atmospheric sciences ,01 natural sciences ,Natural (archaeology) ,Term (time) ,Summer season ,chemistry.chemical_compound ,chemistry ,Apportionment ,Criteria air contaminants ,Environmental science ,021108 energy ,Heavy traffic ,Desert dust ,0105 earth and related environmental sciences ,Civil and Structural Engineering - Abstract
In this paper, we apply a somewhat new approach to address the missing-data issue in the detailed air monitoring dataset of Tehran 2018 (hourly data from over 15 monitoring sites) and to study the short-term effects of anthropogenic/natural activities on the criteria air pollutants (CAP) in urban and rural areas. A homogeneous dataset was produced with the Singular Value Thresholding (SVT) algorithm, and source apportionment was then conducted using the absolute principal component scores-multiple linear regression (APCS-MLR) method. The contributions of major sources to the local ozone level and the correlations between CAP concentrations are also discussed. Results show that vehicular emissions were the predominant contributor to CAP concentrations, accounting for an average of 45%, with desert dust and industries around the city ranking as the next principal CAP sources for the summer season. The residential and transportation sectors accounted for an average of 59% and 18.4% of CAP concentrations in wintertime, respectively. Aerosol formation was accelerated during the "lunch-time" peak, a phenomenon attributed to Tehran's heavy traffic emissions in urban environments. The concentrations of CAP, except ozone, gently increase moving toward the cold days of the year, whereas the O3 concentration usually exceeds 70 μg/m3 on summer days. On-road vehicle emissions were the leading source of ozone formation in Tehran during the studied period. The correlation coefficients of ozone and the other CAP vary to some extent when moving from urban areas toward the rural areas of the city.
- Published
- 2020
30. Effective numbers in the partitioning of biological diversity
- Author
-
Hans-Rolf Gregorius
- Subjects
0106 biological sciences ,0301 basic medicine ,Statistics and Probability ,Metacommunity ,Gamma diversity ,Population ,Biodiversity ,Beta diversity ,Biology ,Models, Biological ,010603 evolutionary biology ,01 natural sciences ,General Biochemistry, Genetics and Molecular Biology ,03 medical and health sciences ,Apportionment ,education ,education.field_of_study ,General Immunology and Microbiology ,Ecology ,Applied Mathematics ,General Medicine ,030104 developmental biology ,Evolutionary biology ,Modeling and Simulation ,Alpha diversity ,General Agricultural and Biological Sciences ,Diversity (business) - Abstract
Admissible measures of diversity allow specification of the number of types (species, alleles, etc.) that are “effectively” involved in producing the diversity (the “diversity effective number”, also referred to as “true diversity”) of a community or population. In metacommunities, effective numbers additionally serve in partitioning the total diversity (symbolized by γ) into one component summarizing the diversity within communities (symbolized by α) and an independent component summarizing the differences between communities (symbolized by β). There is growing consensus that the β-component should be treated in terms of an effective number of “distinct” communities in the metacommunity. Yet, the notion of distinctness is shown in the present paper to remain conceptually ambiguous at least with respect to the diversity within the “distinct” communities. To overcome this ambiguity and to provide the means for designing further desirable effective numbers, a new approach is taken that involves a generalized concept of effective number. The approach relies on first specifying the distributional characteristics of partitioning diversity among communities (among which are differentiation, where the same types tend to occur in the same communities, and apportionment, where different types tend to occur in different communities), then developing the indices which measure these characteristics, and finally inferring the effective numbers from these indices. Major results: (1) The β-component reflects apportionment characteristics of metacommunity structure and is quantified by the “apportionment effective number” of communities (number of effectively monomorphic communities). Since differentiation between communities arises only as a side effect of apportionment, the common interpretation of the β-component in terms of differentiation is unwarranted. 
(2) Multiplicative as well as additive methods of partitioning the total type diversity (γ) involve apportionment effective numbers of communities that are based on different apportionment indices. (3) “Differentiation effective numbers” of communities exist but do not conform with the classical concept of partitioning total type diversity into components within and between communities. (4) Differentiation characteristics are measured as effective numbers of distinct types (rather than communities) from the dual perspective, in which the roles of type and community membership are exchanged. This is relevant e.g. in studies of endemism and competitive exclusion. (5) For Shannon-Wiener diversity, all of the differentiation and apportionment effective numbers are equal, with the exception of those representing additive partitioning. (6) Under either perspective, that is dual or non-dual, measures of compositional differentiation (as originally suggested for the assessment of β-diversity) do not figure in the partitioning of total diversity into components, since they do not build on the intrinsic concept of diversity.
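For the Shannon-Wiener case named in result (5), the "diversity effective number" is exp(H), and the multiplicative partition gives the effective number of communities as β = γ/α. A minimal numerical illustration (the community weights and type frequencies are hypothetical):

```python
import numpy as np

def true_diversity(p):
    """Diversity effective number for Shannon-Wiener diversity: exp(H)."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return np.exp(-np.sum(p * np.log(p)))

# Two equally weighted communities over three types (rows = communities)
comm = np.array([[0.8, 0.2, 0.0],
                 [0.0, 0.2, 0.8]])
w = np.array([0.5, 0.5])

gamma = true_diversity(w @ comm)    # gamma: diversity of the pooled metacommunity
# alpha: weighted (geometric) mean of within-community effective numbers
alpha = np.exp(np.sum(w * [np.log(true_diversity(c)) for c in comm]))
beta = gamma / alpha                # effective number of distinct communities
```

Here the two communities share only the middle type, so β ≈ 1.74: the metacommunity behaves like roughly 1.74 effectively monomorphic communities out of a possible 2.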
- Published
- 2016
31. Tax-induced distortions of effort and compensation in a principal-agent setting
- Author
-
Rainer Niemann, Jan Thomas Martini, and Dirk Simons
- Subjects
Net profit ,Public economics ,05 social sciences ,Principal–agent problem ,050201 accounting ,Principal-agent-problem ,Incentive ,Common Consolidated Corporate Tax Base ,Payroll ,Apportionment ,Accounting ,Common consolidated corporate tax base ,0502 economics and business ,Remuneration ,Economics ,Managerial compensation ,media_common.cataloged_instance ,Formula apportionment ,050207 economics ,European union ,Multi-jurisdictional entities ,Finance ,media_common - Abstract
A common consolidated corporate tax base (CCCTB) and tax allocation via formula apportionment (FA) are hotly debated in the European Union (EU). The objective of this paper is to analyze the tax-induced distortions of managerial incentives and remuneration packages caused by FA. We set up a LEN-type principal-agent model with agents in two different jurisdictions. There are no transactions between the two jurisdictions; all findings are thus driven by FA. If payroll enters the FA formula, the principal demands increased effort from, and pays increased compensation to, managers in low-tax jurisdictions compared to the benchmark case, while managers in high-tax jurisdictions face the opposite effect. Furthermore, the composition of the remuneration changes, which distorts incentives beyond the excessive pay. Lastly, net profit increases because FA offers new potential for profit shifting. VHB-JOURQUAL ranking: B
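The mechanism driving these distortions is easy to see in miniature: under a payroll-based formula, the consolidated base is allocated to each jurisdiction by its payroll share before local rates apply, so shifting payroll toward the low-tax jurisdiction lowers the group's total tax. A minimal sketch with a single payroll factor; all figures and rates are hypothetical:

```python
# Sketch of tax allocation under formula apportionment (FA) with a single
# payroll factor; tax rates and payroll figures are illustrative only.

def fa_tax(consolidated_base, payroll, rates):
    """Allocate the consolidated tax base across jurisdictions by payroll
    share, then apply each jurisdiction's tax rate."""
    total_payroll = sum(payroll.values())
    return {j: consolidated_base * payroll[j] / total_payroll * rates[j]
            for j in payroll}

payroll = {"low_tax": 400.0, "high_tax": 600.0}   # payroll per jurisdiction
rates = {"low_tax": 0.10, "high_tax": 0.30}
tax = fa_tax(1000.0, payroll, rates)              # consolidated base = 1000
total_tax = sum(tax.values())
```

Raising `payroll["low_tax"]` (e.g. via higher compensation there) pulls more of the base under the 10% rate, which is precisely the incentive to distort effort and pay that the paper analyzes.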
- Published
- 2016
32. Global review of recent source apportionments for airborne particulate matter
- Author
-
Yinchang Feng, Qili Dai, Philip K. Hopke, and Linxuan Li
- Subjects
Source apportionment ,Environmental Engineering ,010504 meteorology & atmospheric sciences ,PM2.5 ,010501 environmental sciences ,01 natural sciences ,Article ,World health ,PM10 ,Policy decision ,Apportionment ,Adverse health effect ,Environmental Chemistry ,Waste Management and Disposal ,Air quality index ,0105 earth and related environmental sciences ,business.industry ,Particulate pollution ,Environmental resource management ,Global ,Heavy metals ,Particulates ,Pollution ,Air quality ,Environmental science ,Particulate matter ,business - Abstract
Source apportionments are increasingly performed to determine the origins of ambient particulate pollution. The results can be helpful in designing mitigation strategies to improve air quality. Source-specific particulate matter (PM) concentrations are also being used in health effects studies to focus attention on those sources most likely to be responsible for observed adverse health effects. In 2015, the World Health Organization (WHO) released its initial compilation of source apportionment studies published through August 2014; this initial database was described by Karagulian et al. (Atmospheric Environment 120 (2015) 475–483). In the present report, a new compilation has been prepared of the apportionments published from 2014 through December 2019. In addition, the database has been expanded to include apportionments of heavy metals, water-soluble components, and carbonaceous components in ambient PM. As a result of this work, we have developed and presented some perspectives on source apportionment going forward, and we have made a series of recommendations for conducting and reporting source apportionment studies. It is essential for papers to provide a minimum set of information so that a study can be adequately assessed and its results utilized by others in making policy decisions or as part of other scientific studies. Highlights: comprehensive review of the PM2.5/PM10 source apportionment literature since 2014; includes apportionments for heavy metals, water-soluble species, and carbonaceous PM; major increase in source apportionment studies in China; the most commonly used source apportionment method is positive matrix factorization.
- Published
- 2020
33. Seasonal variation, source apportionment and source attributed health risk of fine carbonaceous aerosols over National Capital Region, India
- Author
-
Tuhin Kumar Mandal, Ranu Gadi, Sudhir Kumar Sharma, and Shivani
- Subjects
Environmental Engineering ,Fine particulate ,Health, Toxicology and Mutagenesis ,0208 environmental biotechnology ,Phthalic Acids ,National capital region ,India ,02 engineering and technology ,010501 environmental sciences ,01 natural sciences ,chemistry.chemical_compound ,Apportionment ,Air Pollution ,medicine ,Environmental Chemistry ,Biomass ,Organic Chemicals ,Polycyclic Aromatic Hydrocarbons ,Health risk ,Air quality index ,Vehicle Emissions ,0105 earth and related environmental sciences ,Aerosols ,Total organic carbon ,Air Pollutants ,Levoglucosan ,Public Health, Environmental and Occupational Health ,General Medicine ,General Chemistry ,Seasonality ,medicine.disease ,Pollution ,Carbon ,020801 environmental engineering ,chemistry ,Health ,Environmental chemistry ,Environmental science ,Particulate Matter ,Seasons ,Environmental Monitoring - Abstract
Deteriorating air quality with high levels of fine particulate matter (PM2.5) over the National Capital Region (NCR) of India is one of the serious environmental and scientific issues. In this paper, PM2.5 samples were collected for 24 h twice or thrice a week during December 2016–December 2017 at three sites [Delhi (IG), Modinagar (MN) and Mahendragarh (HR)] over the NCR to analyse the carbonaceous aerosols. Source apportionment of PM2.5 was attempted using principal component analysis (PCA) and positive matrix factorization (PMF) based on the analysed carbonaceous fractions [organic carbon (OC), elemental carbon and secondary organic carbon (SOC)]. Organic compounds (alkanes, hopanes, steranes, polycyclic aromatic hydrocarbons (PAHs), phthalates, levoglucosan and n-alkanoic acids) were analysed to distinguish the emission sources. Total carbonaceous aerosols (TCA) contributed significantly (∼26%) to PM2.5, revealing their importance in source apportionment. Estimated SOC contributed 43.2%, 42.2% and 58.2% to OC, and 5.4%, 5.3% and 7.8% to PM2.5, at the IG, MN and HR sites, respectively. PCA and PMF apportioned five emission sources for PM2.5: vehicular emissions (34.6%), biomass burning (26.8%), cooking emissions (15.7%), plastic and waste burning (13.5%) and secondary organic carbon (9.5%). Source-attributed health risk was also calculated in terms of lung cancer risk (LCR) associated with PAH exposure, concluding that vehicular emissions (40.3%), biomass burning (38.1%) and secondary organic carbon (12.8%) contributed most to the LCR (503.2 × 10−5; ∼503 cases in 100,000). Health risk assessment combined with source apportionment inferences signifies the need for immediate implementation of emission reduction strategies, with a special focus on the transport sector and biomass burning over the NCR of India.
- Published
- 2019
34. Economic Geography, Political Inequality, and Public Goods in the Original 13 US States
- Author
-
Jeffrey L. Jensen and Pablo Beramendi
- Subjects
Politics ,State (polity) ,Apportionment ,Political science ,media_common.quotation_subject ,Legislature ,Economic geography ,Democratization ,Public good ,Independence ,Representation (politics) ,media_common - Abstract
A large and fruitful literature has focused on the impact of colonial legacies on long-term development. Yet the mechanisms through which these legacies get transmitted over time remain ambiguous. This paper analyzes the choice and effects of legislative representation as one such mechanism, driven by elites interested in maximizing jointly economic prospects and political influence over time. We focus on malapportionment in the legislatures of the original thirteen British North-American colonies. Their joint independence created a unique juncture in which postcolonial elites simultaneously chose the legislative and electoral institutions under which they would operate. We show that the initial choice of apportionment in the state legislatures is largely a function of economic geography, that such a choice generated persistent differences in representation patterns within states (political inequality), and that the latter shaped public goods provision in the long run.
- Published
- 2018
35. Cartel Damages: Liability and Settlement
- Author
-
Ben Bornemann
- Subjects
Plaintiff ,Apportionment ,Liability ,Cartel ,Damages ,media_common.cataloged_instance ,Joint and several liability ,Business ,European union ,Settlement (litigation) ,media_common ,Law and economics - Abstract
This paper relates civil joint and several liability of members of a cartel, contribution claims, liability allocation, as well as the anatomy and effects of settlements with less than all parties. Because the substantive rules of Directive 2014/104/EU (the Directive) do not apply to damages that occurred prior to its implementation, the pre-Directive legal framework will continue to be relevant for years. This work therefore analyses both the old and the new sets of rules and how the changing regimes influence incentives. Under pre-Directive law, victims of competition law infringements can claim their entire damages from any of the cartelists. Under some pre-Directive national laws, cartelists sued by the claimant might not be able to take effective recourse against members of the cartel that the claimant did not sue. The effect of a partial settlement in terms of claim reduction and liability of the settling cartelist varies depending on the jurisdictions involved and the wording chosen. Under the Directive, cartelists are also jointly liable for the entire damage caused; however, there are exceptions for the immunity recipient and small and medium enterprises that modify this basic rule. Cartelists do have effective recourse against each other. The effect of a partial settlement is uniform throughout the EU: It always reduces the plaintiff’s claim by the settling cartelist’s share of the harm, and the remaining cartelists can no longer bring contribution claims against the settling cartelist. This facilitates the conclusion of incentive-compatible settlement agreements to cooperate in pursuing the remaining cartelists for damages. 
This work discusses a multitude of criteria for liability apportionment amongst cartelists, leading into an assessment of two proposals that have been advanced to make the liability apportionment amongst cartelists more predictable: (A) using the ratio of the cartelists’ relative sales and (B) calculating Shapley values of the cartelists’ contributions to the cartel’s effects. I find that relative sales are a suitable starting heuristic for most cases, but there are constellations in which this method leads to absurd results. Shapley values theoretically provide the perfect metric to measure relative responsibility for the harm but require input that is usually exceedingly difficult to obtain. The final part briefly contrasts these results to the situation under US federal law and finds that the combination of no contribution rights amongst cartelists, the trebling of damages and pro tanto claim reduction create stronger incentives to settle than in Europe.
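The Shapley apportionment the paper assesses averages each cartelist's marginal contribution to the harm over all orders in which the cartel could have been assembled. A minimal sketch with a hypothetical quadratic harm function, an assumption chosen for illustration rather than the paper's model:

```python
from itertools import permutations
from math import factorial

def shapley_shares(players, harm):
    # Shapley value: average marginal contribution over all join orders.
    shares = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            shares[p] += harm(coalition | {p}) - harm(coalition)
            coalition |= {p}
    n_fact = factorial(len(players))
    return {p: v / n_fact for p, v in shares.items()}

# Illustrative harm function (an assumption, not from the paper):
# a coalition's harm grows with the square of its combined sales,
# capturing superadditive cartel effects.
sales = {"A": 50, "B": 30, "C": 20}
harm = lambda S: sum(sales[p] for p in S) ** 2 / 100

print(shapley_shares(sales, harm))
```

For this particular harm function the Shapley shares coincide with the cartelists' sales (50/30/20), which illustrates why relative sales can be a workable starting heuristic; other harm functions break that equality, which is where the paper locates the "absurd results" of the sales-ratio method.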
- Published
- 2018
36. Benefit-cost analysis and the construction and financing of rail/highway grade separations
- Author
-
J.S. Dodgson
- Subjects
Engineering ,Cost allocation ,Cost–benefit analysis ,business.industry ,Poison control ,Level crossing ,Cost burden ,Transport engineering ,Apportionment ,General Earth and Planetary Sciences ,Capital cost ,business ,General Environmental Science ,Valuation (finance) - Abstract
This paper is concerned with the use of benefit-cost analysis to appraise investment in the construction of grade separations to eliminate or reduce highway traffic over existing rail/highway grade crossings. The paper deals with the measurement and valuation problems involved, and presents the results of a study of five grade separations authorised in Canada. A second issue considered is that of the determination of the appropriate apportionment of capital costs among the parties, rail and highway, involved. Alternative principles for determining who should bear the cost burden of grade separations are developed, and the implications of these cost apportionment procedures for the five Canadian case studies explored.
- Published
- 1984
37. Balancing an insurance portfolio by class of business
- Author
-
Greg Taylor
- Subjects
Statistics and Probability ,Economics and Econometrics ,Actuarial science ,Apportionment ,Systematic risk ,Diversification (finance) ,Economics ,Portfolio ,Time horizon ,Profitability index ,Statistics, Probability and Uncertainty ,Modern portfolio theory ,Underwriting - Abstract
The paper considers the benefits to be gained from diversification of an insurance operation into a number of classes of business. This is formulated mathematically. It is assumed that the insurer's objective is to maximize increase in utility of wealth over the period up to his planning horizon (Section 2). Several forms of solution of this problem are considered (Section 3). In particular, it is found that there is no benefit in diversification if the parameters characterizing the stochastic properties of the claims of each class of business are known with certainty. In these circumstances, utility is maximized by underwriting all premium in that class of business projected to produce the greatest profitability. It is also seen that, unless the concept of the insurer's risk aversion is incorporated somehow, for example by means of a utility function, no benefits from diversification appear (Section 5.2). The benefits of diversification arise from uncertainty concerning the basic parameters of the claims process in the various classes of business. A parallel with the concepts of diversifiable and undiversifiable risk from Modern Portfolio Theory is pointed out (Section 4). A measure of the balance achieved within a portfolio is constructed. A somewhat simplified version of the problem is considered in Section 5, and an explicit algebraic solution obtained. Of the total budgeted premium to be underwritten in a particular year, the solution prescribes the percentages to be underwritten in the various classes of business. The solution depends upon the projected profitability and the parameters underlying the claims process in each of the classes. Some verbal interpretation of the algebraic solution is provided. That solution is applied to a realistic numerical example (Section 6). The apportionment of total budget premium by class of business arrived at in the numerical example is compared with the actual budget of a particular insurer. 
A number of qualifying remarks concerning the application of the results of the paper are made in Section 7.
- Published
- 1987
38. Apportionment of lumbar L2–S1 rotation across individual motion segments during a dynamic lifting task
- Author
-
Liying Zheng, Xudong Zhang, Ameet Aiyangar, and William Anderst
- Subjects
Adult ,Male ,Lifting ,Rotation ,Biomedical Engineering ,Biophysics ,Motion (geometry) ,Kinematics ,Lumbar vertebrae ,Young Adult ,Lumbar ,Apportionment ,Position (vector) ,Statistics ,medicine ,Humans ,Orthopedics and Sports Medicine ,Range of Motion, Articular ,Mathematics ,Lumbar Vertebrae ,Rehabilitation ,Lumbosacral Region ,Anatomy ,Healthy Volunteers ,Biomechanical Phenomena ,medicine.anatomical_structure ,Metric (mathematics) ,Female ,Rotation (mathematics) - Abstract
Segmental apportionment of lumbar (L2-S1) rotation is a critical input parameter for musculoskeletal models and a candidate metric for clinical assessment of spinal health, but such data are sparse. This paper aims to quantify the time-variant and load-dependent characteristics of intervertebral contributions to L2-S1 extension during a dynamic lifting task. Eleven healthy participants lifted multiple weights (4.5, 9.1, and 13.6 kg) from a trunk-flexed to an upright position while being imaged by a dynamic stereo X-ray system at 30 frames/s. Vertebral (L2-S1) motion was tracked using a previously validated volumetric model-based tracking method that employs 3D bone models reconstructed from subject-specific CT images to obtain high-accuracy (≤0.26°, 0.2 mm) 3D vertebral kinematics. Individual intervertebral motions as percentages of the total L2-S1 extension were computed at each % increment of the motion to show the segmental apportionment. Results showed L3-L4 (25.8±2.2%) and L4-L5 (31±3.1%) together contributed a larger share (∼60% combined) compared to L2-L3 (21.7±3.7%) and L5-S1 (22.6±4.7%); L4-L5 consistently provided the largest contribution of the measured segments. Relative changes over time in L3-L4 (6±12.5%) and L4-L5 (0.5±10.2%) contribution were minimal; in contrast, L2-L3 (18±20.1%) contribution increased while L5-S1 (-33±22.9%) contribution decreased in a somewhat complementary fashion as motion progressed. No significant effect of the magnitude of load lifted on individual segmental contribution patterns was detected. The current study updated the knowledge regarding apportionment of lumbar (L2-S1) motion among individual segments, serving both as input into musculoskeletal models and as potential biomechanical markers of low back disorders.
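The contribution metric described above is each segment's rotation expressed as a percentage of the total L2–S1 rotation at each increment of the motion. A minimal sketch with synthetic linear angle trajectories (illustrative numbers, not the study's measurements):

```python
import numpy as np

# Synthetic intervertebral extension angles (degrees) at 101 motion
# increments (0..100% of the lift) for four segments; the slopes are
# illustrative, not the study's data.
t = np.linspace(0, 1, 101)
angles = np.vstack([
    22 * t,  # L2-L3
    26 * t,  # L3-L4
    31 * t,  # L4-L5
    21 * t,  # L5-S1
])

total = angles.sum(axis=0)                   # total L2-S1 extension
total[total == 0] = np.nan                   # share undefined at 0% motion
contrib = 100 * angles / total               # per-segment share (%) over time
print(np.nanmean(contrib, axis=1).round(1))  # -> [22. 26. 31. 21.]
```

With real data the per-increment `contrib` curves, rather than their means, carry the time-variant pattern the study reports (e.g. L2-L3 rising and L5-S1 falling as motion progresses).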
- Published
- 2015
39. Dendrohydrology in Canada’s western interior and applications to water resource management
- Author
-
Jessica Vanstone, Jeannine-Marie St. Jacques, Robert Sauchyn, and David J. Sauchyn
- Subjects
geography ,geography.geographical_feature_category ,010504 meteorology & atmospheric sciences ,0207 environmental engineering ,Drainage basin ,Climate change ,02 engineering and technology ,15. Life on land ,01 natural sciences ,6. Clean water ,Proxy (climate) ,Technical support ,13. Climate action ,Apportionment ,Streamflow ,Population growth ,020701 environmental engineering ,Water resource management ,Surface water ,0105 earth and related environmental sciences ,Water Science and Technology - Abstract
Across the southern Canadian Prairies, annual precipitation is relatively low (200–400 mm) and periodic water deficits limit economic and environmental productivity. Rapid population growth, economic development and climate change have exposed this region to increasing vulnerability to hydrologic drought. There is high demand for surface water, streamflow from the Rocky Mountains in particular. This paper describes the application of dendrohydrology to water resource management in this region. Four projects were initiated by the sponsoring organizations: a private utility, an urban municipality and two federal government agencies. The fact that government and industry would initiate and fund tree-ring research indicates that practitioners recognize paleohydrology as a legitimate source of technical support for water resource planning and management. The major advantage of tree-rings as a proxy of annual and seasonal streamflow is that the reconstructions exceed the length of gauge records by at least several centuries. The extent of our network of 180 tree-ring chronologies, spanning AD 549–2013 and ∼20° of latitude, with a high density of sites in the headwaters of the major river basins, enables us to construct large ensembles of tree-ring reconstructions as a means of expressing uncertainty in the inference of streamflow from tree rings. We characterize paleo-droughts in terms of modern analogues, translating the tree-ring reconstructions from a paleo-time scale to the time frame in which engineers and planners operate. Water resource managers and policy analysts have used our paleo-drought scenarios in their various forms to inform and assist drought preparedness planning, a re-evaluation of surface water apportionment policy and an assessment of the reliability of urban water supply systems.
- Published
- 2015
40. Incentive Pricing of Shared Services with Normal Distribution of Order Flow
- Author
-
Tomáš Buus
- Subjects
Distribution center ,Service (business) ,Transfer price ,Operations research ,Total cost ,Computer science ,General Engineering ,Energy Engineering and Power Technology ,Transfer pricing ,Investment center ,Cash pool ,Incentive ,Apportionment ,Center (algebra and category theory) ,Operations management ,Shared service center - Abstract
This paper presents a simple yet efficient formula for apportionment of the cost generated by the variability of the flow of requirements (objects, inventory, money) through a shared service center (distribution center, internal bank, or other service center), as well as a formula for apportionment of the cost that the flow would generate if it were steady. The presented formulas ensure that shared service center costs are charged fairly and provide an incentive for the shared service center's counterparts to optimize the timing and size of their requirements towards the shared service center and to minimize the total cost of handling them. Additionally, we challenge the marginalist transfer pricing theory.
- Published
- 2015
- Full Text
- View/download PDF
41. Taxing Intellectual Property in the Global Economy: A Plea for Regulated and Internationally Coordinated Profit Splitting
- Author
-
Wolfram F. Richter
- Subjects
Equity (economics) ,Profit (accounting) ,Common Consolidated Corporate Tax Base ,Tax competition ,business.industry ,Incentive compatibility ,Apportionment ,International trade ,Formulary apportionment ,business ,Drawback - Abstract
Inter-country equity in the taxation of IP is a contentious issue. With its BEPS initiative, the OECD aims at taxing in accordance with value creation even though there are admitted difficulties in determining the actual place of value creation. The European Commission promotes the introduction of unitary taxation. The proposal’s drawback is that it lacks incentive compatibility in information exchange. Furthermore, it stipulates a cost-dependent apportionment of the common consolidated corporate tax base that incentivizes locating R&D in low-tax countries. Against this background, this paper makes a case for an internationally regulated split of the profit earned with imported IP.
- Published
- 2017
42. The Patent Damages Gap: An Economist's Review of U.S. Patent Damages Apportionment Rules
- Author
-
Anne Layne-Farrar
- Subjects
Engineering ,Actuarial science ,business.industry ,Apportionment ,Common law ,Damages ,Patent infringement ,business ,Market value ,Supreme court - Abstract
As an economist, I find the current state of the law regarding damages for patent infringement – most particularly that relating to apportionment – frustrating at best and woefully incomplete at worst. Namely, damages case law for utility patent infringement provides two very different, but insufficient, guidance frameworks for calculating damages: the entire market value rule (EMVR) versus the smallest salable patent practicing unit (SSPPU) principle. The modern pair of EMVR and SSPPU options is far narrower than the approaches afforded by the original 1884 Supreme Court ruling establishing apportionment for damages, Garretson. In this paper, I present the economic case for expanding the allowable damages frameworks beyond EMVR or SSPPU, to fill in the gap in reasonable damages approaches created by an EMVR and SSPPU dichotomy.
- Published
- 2017
43. Toxicity evaluation and source apportionment of Polycyclic Aromatic Hydrocarbons (PAHs) at three stations in Istanbul, Turkey
- Author
-
Edip Avşar, Burcak Kaynak, Kadir Alp, and Asude Hanedar
- Subjects
Air Pollutants ,Anthracene ,Environmental Engineering ,Turkey ,Istanbul turkey ,Suspended particles ,Environmental engineering ,Pollution ,Diesel fuel ,chemistry.chemical_compound ,chemistry ,Apportionment ,Air Pollution ,Environmental chemistry ,Toxicity ,Environmental Chemistry ,Particulate Matter ,Polycyclic Aromatic Hydrocarbons ,Health risk ,Waste Management and Disposal ,Environmental Monitoring - Abstract
This paper focuses on the toxicity evaluation and source apportionment of Polycyclic Aromatic Hydrocarbons (PAHs) at three monitoring stations in Istanbul, Turkey. A total of 326 airborne samples were collected and analyzed for 16 PAHs and Total Suspended Particles (TSP) over the period September 2006–December 2007. The total average PAH concentrations were 100.7 ± 613, 84.6 ± 46.7 and 25.1 ± 13.3 ng m⁻³ and the TSP concentrations were 101.2 ± 53.2, 152.3 ± 99.1 and 49.8 ± 18.6 µg m⁻³ for the URB1, URB2 and RUR stations, respectively. Benzo(a)pyrene (BaP) toxic equivalency factors relative to PAH concentration values were calculated, indicating that the health risks of BaP and dibenz(a,h)anthracene (markers of traffic emissions) have the highest contribution of all species measured at the sampling sites. In order to determine PAH sources, two source apportionment techniques were applied to the measurements: diagnostic ratios (DR) and Positive Matrix Factorization (PMF). The results of the two applications were compatible, indicating vehicle emissions, especially diesel engines, as the major source for the urban stations. (C) 2013 Elsevier B.V. All rights reserved.
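The BaP toxic-equivalency calculation mentioned above weights each PAH concentration by its toxic equivalency factor (TEF) and sums: BaP-TEQ = Σ Cᵢ · TEFᵢ. A minimal sketch using commonly cited Nisbet and LaGoy (1992) TEFs for a subset of species; the concentrations are hypothetical, and neither the values nor the species list are from this study:

```python
# TEFs after Nisbet & LaGoy (1992) -- commonly cited values for a
# subset of the 16 priority PAHs; concentrations are hypothetical.
TEF = {"BaP": 1.0, "DahA": 5.0, "BaA": 0.1, "BbF": 0.1, "Chr": 0.01}
conc_ng_m3 = {"BaP": 1.2, "DahA": 0.4, "BaA": 2.0, "BbF": 1.5, "Chr": 3.0}

bap_teq = sum(conc_ng_m3[p] * TEF[p] for p in TEF)  # ng BaP-eq per m3
print(round(bap_teq, 3))  # -> 3.58
```

Note how DahA, despite its low concentration, dominates the equivalent dose because of its high TEF, which is why the abstract singles out BaP and dibenz(a,h)anthracene as the main health-risk contributors.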
- Published
- 2014
44. Source apportionment of heavy metals in farmland soil of Wuwei, China: Comparison of three receptor models
- Author
-
Yanyan Yang, Rui Zhao, Haiping Luo, Qingyu Guan, Ninghui Pan, and Feifei Wang
- Subjects
Pollution ,Pollutant ,Renewable Energy, Sustainability and the Environment ,business.industry ,020209 energy ,Strategy and Management ,media_common.quotation_subject ,05 social sciences ,Fossil fuel ,Environmental engineering ,02 engineering and technology ,Contamination ,Pesticide ,Industrial and Manufacturing Engineering ,Apportionment ,Agriculture ,050501 criminology ,0202 electrical engineering, electronic engineering, information engineering ,Environmental science ,Coal ,business ,0505 law ,General Environmental Science ,media_common - Abstract
Receptor models, though rarely utilized for soil, are often used to identify pollutant sources and quantify their contributions. This paper focuses on the soil of oasis farmland. A geochemical baseline is used to assess the pollution of the soil, and then three models are tentatively utilized to apportion heavy metals and to compare the sources, the contributions and the operational performance. Pollution assessment indicated that the farmland soil of Wuwei was lightly contaminated by heavy metals. Source apportionment suggested that atmospheric deposition contributed the most pollution (53.95%–65.35%). The three models complemented each other, and the grouped principal component analysis/absolute principal component scores (GPCA/APCS) model was outstanding. GPCA/APCS and UNMIX suggested that agricultural activities were the prime anthropogenic source (51.06%–61.56%), followed by the combustion of fossil fuels (coal and oil) (27.92%–28.66%) and building-materials-related activities (10.52%–20.29%). Within agricultural activities, fertilizers and pesticides (67.88%–74.81%) contributed more than traffic emissions (25.19%–32.12%). Similar results were acquired via positive matrix factorization (PMF), although industrial activity was the highest individual contributor (29.91%). Therefore, combining these three models was the most effective approach, and more attention should be paid to mitigating the pollution caused by the use of fertilizers and pesticides as well as by the industrial activities in Wuwei. The results of this study can serve as a reference for reducing heavy metal pollution in farmland soil.
- Published
- 2019
45. 'Zero'
- Author
-
Vijay Nambi, Christie M. Ballantyne, Peter B. Jones, and Salim S. Virani
- Subjects
Gerontology ,03 medical and health sciences ,0302 clinical medicine ,business.industry ,Apportionment ,Medicine ,030212 general & internal medicine ,030204 cardiovascular system & hematology ,Cardiology and Cardiovascular Medicine ,business ,Zero (linguistics) - Abstract
We read with great interest the paper by Mortensen et al. (1) on negative risk markers and cardiovascular events in older adults (age >65 years) published in the Journal. “Derisking” and appropriate apportionment of therapies allow clinicians and patients to withhold or de-escalate
- Published
- 2019
46. A method for the source apportionment in bathing waters through the modelling of wastewater discharges: Development of an indicator and application to an urban beach in Santander (Northern Spain)
- Author
-
Andrés García, José A. Revilla, Javier F. Bárcena, César Álvarez, José L. Gil, and Iago López
- Subjects
Pollution ,Ecology ,Bathing ,Sanitation ,media_common.quotation_subject ,Stormwater ,General Decision Sciences ,Wastewater ,Apportionment ,Environmental science ,Hydrometeorology ,Water resource management ,Effluent ,Ecology, Evolution, Behavior and Systematics ,media_common - Abstract
The approval of the current Bathing Water Directive (Directive 2006/7/EC) involves a set of changes relative to its predecessor. One of the most important is the requirement to define bathing water profiles. These profiles should identify putative pollution sources (“source apportionment”). At urban beaches, bacteriological pollution problems depend, in most instances, on the operation of the sanitation system. Owing to the novelty of the bathing water profile concept, the methodologies proposed in the literature for this task are still scarce. This paper presents a methodology to assess the percentage of time in which a bacteriological concentration standard of Directive 2006/7/EC is exceeded, which can serve as a suitable indicator of bacteriological pollution in bathing waters because it is related to the probability of impairing the Directive's regulatory standards. This indicator is useful for source apportionment in bathing waters and, consequently, for the definition of the bathing water profile, and it can be used in combination with other sources of information, especially those provided by the analysis of extreme events, which may be of greater interest to managers. The methodology involves the characterisation of effluents under different hydrometeorological conditions and the characterisation of the advection, diffusion and reaction processes of bacteriological organisms in the aquatic environment using mathematical models, for which calibration and sensitivity analyses were carried out. The outcome of applying this methodology to an urban beach located in Santander (Northern Spain) is also shown. To define the bathing water profile we considered the sanitation system operating under normal conditions, including combined sewer overflows (CSOs) and uncontrolled discharges.
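The proposed indicator reduces, in essence, to the fraction of modelled concentrations that exceed a regulatory value. A minimal sketch on synthetic model output; the 500 cfu/100 mL threshold is an illustrative assumption, not the Directive's exact percentile-based standard:

```python
import numpy as np

rng = np.random.default_rng(1)

# Modelled E. coli concentrations (cfu/100 mL) at a beach control
# point, e.g. hourly output of an advection-diffusion-decay model;
# here just lognormal synthetic data standing in for model results.
conc = rng.lognormal(mean=4.5, sigma=1.2, size=8760)

THRESHOLD = 500  # illustrative limit, not the Directive's percentile rule

exceedance_pct = 100 * np.mean(conc > THRESHOLD)
print(f"{exceedance_pct:.1f}% of modelled hours exceed {THRESHOLD} cfu/100 mL")
```

Computed per discharge scenario (dry weather, CSO events, uncontrolled discharges), this exceedance percentage apportions the bacteriological risk among sources in the spirit of the paper's indicator.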
- Published
- 2013
47. Reliability apportionment approach for spacecraft solar array using fuzzy reasoning Petri net and fuzzy comprehensive evaluation
- Author
-
Jianing Wu, Shaoze Yan, Peng Gao, and Liyang Xie
- Subjects
Mechanical system ,Engineering ,Spacecraft ,business.industry ,Apportionment ,Photovoltaic system ,Synchronization (computer science) ,Aerospace Engineering ,Petri net ,business ,Fuzzy logic ,Reliability (statistics) ,Reliability engineering - Abstract
The reliability apportionment of a spacecraft solar array is of significant importance for spacecraft designers in the early stage of design. However, it is difficult to use existing methods to resolve the reliability apportionment problem because of data insufficiency and the uncertainty of the relations among the components of the mechanical system. This paper proposes a new method which combines fuzzy comprehensive evaluation with a fuzzy reasoning Petri net (FRPN) to accomplish the reliability apportionment of the solar array. The proposed method extends previous fuzzy methods and focuses on the characteristics of the subsystems and the intrinsic associations among the components. The analysis results show that, before design and manufacturing, the synchronization mechanism should be apportioned the highest reliability value and the solar panels and hinges the lowest. The developed method is of practical significance for the reliability apportionment of solar arrays where the design information has not yet been clearly identified, particularly in the early stage of design.
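The FRPN machinery itself is not reproduced here, but a common baseline that fuzzy apportionment methods extend is weighted allocation for a series system: a system reliability target R_sys is split as R_i = R_sys^(w_i) with weights summing to one, so the allocated reliabilities multiply back to the target. The subsystem names and weights below are hypothetical, not the paper's data:

```python
# Weighted reliability apportionment for a series system:
# R_i = R_sys ** w_i with sum(w_i) == 1, so prod(R_i) == R_sys.
# Subsystems and weights are hypothetical, not the paper's data.
from math import prod

R_sys = 0.99  # system reliability target
weights = {"synchronization": 0.10, "solar panels": 0.45, "hinges": 0.45}

alloc = {name: R_sys ** w for name, w in weights.items()}
for name, r in alloc.items():
    print(f"{name}: {r:.5f}")
assert abs(prod(alloc.values()) - R_sys) < 1e-12
```

Because R_sys < 1, a smaller weight (lower complexity or criticality burden) yields a higher allocated reliability, consistent with the abstract's ranking in which the synchronization mechanism receives the highest target and the panels and hinges the lowest.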
- Published
- 2012
48. A divisor apportionment method based on the Kolm–Atkinson social welfare function and generalized entropy
- Author
-
Junichiro Wada
- Subjects
Stolarsky mean ,Sociology and Political Science ,Logarithmic mean ,Apportionment ,Economics ,General Social Sciences ,Social welfare maximization ,Statistics, Probability and Uncertainty ,Mathematical economics ,Social welfare function ,General Psychology ,Entropy minimization - Abstract
This paper links Stolarsky mean apportionment methods, which include the US House of Representatives, the Sainte-Laguë, and the d'Hondt methods, to Kolm–Atkinson social welfare maximization and to generalized entropy minimization. Within this class, the logarithmic mean apportionment method is the most unbiased one that assigns at least one seat to each state.
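All the divisor (highest-averages) methods named above fit one template: repeatedly award the next seat to the claimant with the largest population divided by d(s), where s is the number of seats already held and d is the method's divisor function. A minimal sketch; the populations are hypothetical, and the logarithmic-mean divisor reflects the abstract's property that every state receives at least one seat (the divisor is 0 at s = 0):

```python
from math import log, inf

def highest_averages(votes, seats, divisor):
    """Assign `seats` by repeatedly giving the next seat to the
    claimant with the largest votes / divisor(seats_already_held)."""
    held = {p: 0 for p in votes}
    for _ in range(seats):
        p = max(votes, key=lambda q: inf if divisor(held[q]) == 0
                else votes[q] / divisor(held[q]))
        held[p] += 1
    return held

dhondt       = lambda n: n + 1      # d'Hondt
sainte_lague = lambda n: 2 * n + 1  # Sainte-Laguë
# Logarithmic mean of n and n+1; it equals 0 at n = 0, so every
# state receives its first seat before any proportional comparison.
log_mean     = lambda n: 0 if n == 0 else 1 / log((n + 1) / n)

pops = {"A": 5300, "B": 2900, "C": 1100, "D": 90}
print(highest_averages(pops, 10, log_mean))
```

Swapping the divisor function switches the method, which is exactly the one-parameter family the paper embeds in the Stolarsky mean class.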
- Published
- 2012
49. Evaluating competing criteria for allocating parliamentary seats
- Author
-
Patrick Bernhagen, Richard Rose, and Gabriela Borz
- Subjects
Sociology and Political Science ,business.industry ,Parliament ,Compromise ,media_common.quotation_subject ,General Social Sciences ,Distribution (economics) ,Public administration ,Politics ,Apportionment ,Economics ,Normative ,Resizing ,Statistics, Probability and Uncertainty ,Treaty ,business ,General Psychology ,media_common ,Law and economics - Abstract
In an established parliament any proposal for the allocation of seats will affect sitting members and their parties and is therefore likely to be evaluated by incumbents in terms of its effects on the seats that they hold. This paper evaluates the Cambridge Compromise's formula in relation to the compromises between big and small states that have characterised the EU since its foundation. It also evaluates the formula by the degree to which the Compromise departs from normative standards of equality among citizens and by whether its distribution of seats creates more anxiety about the risk of losses than about hypothetical gains. These political criteria explain the objections to the Cambridge Compromise. However, the pressure to change the allocation of seats continues with EU enlargement and the arbitrary ceiling of 751 seats imposed by the Lisbon Treaty.
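For context, the Cambridge Compromise is a "base + prop" rule: as proposed, each member state receives a fixed base of 5 seats plus its population divided by a common divisor, rounded upwards and capped at 96, with the divisor tuned so the seats sum to the house size of 751. A hedged sketch under those assumptions; the populations and the small chamber below are hypothetical, and step effects and rounding ties, which the published method treats explicitly, can leave simple bisection one seat off for some inputs:

```python
from math import ceil

def cambridge_compromise(pops, house_size=751, base=5, cap=96):
    """Base + prop: base seats plus ceil(pop/divisor), capped, with the
    divisor found by bisection so the seats sum to house_size."""
    def seats(d):
        return {s: min(base + ceil(p / d), cap) for s, p in pops.items()}
    lo, hi = 1.0, float(max(pops.values()))
    for _ in range(200):  # total seats are non-increasing in the divisor
        mid = (lo + hi) / 2
        if sum(seats(mid).values()) > house_size:
            lo = mid
        else:
            hi = mid
    return seats(hi)

# Hypothetical populations and a small chamber for illustration.
pops = {"A": 80_000, "B": 60_000, "C": 10_000, "D": 1_000}
print(cambridge_compromise(pops, house_size=40, base=5, cap=20))
```

The fixed base is what cushions the smallest states, and the cap is what binds the largest; the tension between those two features and citizen equality is precisely what the paper's political criteria probe.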
- Published
- 2012
50. Corrigendum to 'A source apportionment of U.S. fine particulate matter air pollution'[Atmos. Environ. 45/24 (2011) 3924–3936]
- Author
-
George D. Thurston, Kazuhiko Ito, and Ramona Lall
- Subjects
Atmospheric Science ,Component (thermodynamics) ,Fine particulate ,Trace element ,Air pollution ,Environmental engineering ,Atmospheric sciences ,medicine.disease_cause ,k-nearest neighbors algorithm ,chemistry.chemical_compound ,Variable (computer science) ,chemistry ,Apportionment ,medicine ,Environmental science ,Nitrogen dioxide ,General Environmental Science - Abstract
Subsequent to this manuscript going to press, we discovered that approximately one-third of the nitrogen dioxide (NO2) data downloaded from the Health Effects Institute (HEI)/Atmospheric and Environmental Research (AER) database for the “nearest neighbor” site (16,641 of 46,478 observations) were actually from sites outside the metropolitan statistical area (MSA) where the Chemical Speciation Network (CSN) site being considered was located. In this Corrigendum, we evaluate what aspects of the reported results were affected, and re-estimate the source impacts and profiles after addressing this NO2 data issue. To evaluate the importance of this NO2 data issue to the initial factor analysis, we repeated the factor analysis: 1) for the entire dataset considering only trace element data (i.e., without the NO2 variable) (n = 46,478 observations); and 2) for the subset of the data having NO2 data from within the same MSA as the CSN site (n = 29,837 observations). In the first case, all source-related factors except for Traffic were reproduced with negligible change, producing correlations to the original factors as follows: Crustal/Soil r = 1.00, Metals-Related r = 0.99, Salt Aerosols r = 1.00, Residual Oil r = 0.98, Steel Industry r = 1.00, Coal-Burning r = 0.99, and Biomass Burning r = 0.98. However, the Traffic factor without the NO2 variable had a distinctly lower correlation with the original Traffic factor (i.e., r = 0.86). As previously noted in the original paper, “Without the inclusion of NO2 to the analysis, it was not possible for the factor analysis to separate a distinct traffic component”. Clearly, the Traffic component was most notably affected by the elimination of the NO2 variable because it was the component with the largest NO2 loading in the original analysis.
In our follow-up factor analysis (for the subset with within-MSA NO2 data, n = 29,837 observations), we found that the Traffic component was again resolved, just as in the original analysis (r = 0.95 with the original scores for these 29,837 data observations), indicating that the original Traffic component was not an artifact of the non-MSA NO2 data, and that it was a valid representation of the Traffic component, with the exception of the observations for out-of-MSA NO2 values.
- Published
- 2012