Search Results
87 results
2. Statistical Models for Political Science Event Counts: Bias in Conventional Procedures and Evidence for the Exponential Poisson Regression Model
- Author
King, Gary
- Abstract
This paper presents analytical, Monte Carlo, and empirical evidence on models for event count data. Event counts are dependent variables that measure the number of times some event occurs. Counts of international events are probably the most common, but numerous examples exist in every empirical field of the discipline. The results of the analysis below strongly suggest that the way event counts have been analyzed in hundreds of important political science studies has produced statistically and substantively unreliable results. Misspecification, inefficiency, bias, inconsistency, insufficiency, and other problems result from the unknowing application of two common methods that are without theoretical justification or empirical utility in this type of data. I show that the exponential Poisson regression (EPR) model provides analytically, in large samples, and empirically, in small, finite samples, a far superior model and optimal estimator. I also demonstrate the advantage of this methodology in an application to nineteenth-century party switching in the U.S. Congress. Its use by political scientists is strongly encouraged., Government
- Published
- 1988
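For readers unfamiliar with the model class this abstract discusses, here is a minimal sketch of a Poisson regression with a log link. The data, covariate, and coefficients are invented; this is the textbook Poisson log-likelihood, not King's EPR estimator itself.

```python
# Illustrative Poisson regression with a log link: E[y|x] = exp(x'beta).
# Synthetic data; all names and values here are invented for this sketch.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one covariate
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))                 # simulated event counts

def neg_log_lik(beta):
    eta = X @ beta
    # Poisson log-likelihood up to a constant: sum(y*eta - exp(eta))
    return -(y * eta - np.exp(eta)).sum()

fit = minimize(neg_log_lik, x0=np.zeros(2), method="BFGS")
print("estimated coefficients:", fit.x)  # should land near [0.5, 0.8]
```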
3. Stock Prices, Earnings, and Expected Dividends
- Author
Shiller, Robert and Campbell, John
- Abstract
Long historical averages of real earnings help forecast present values of future real dividends. With aggregate U.S. stock market data (1871-1986), a vector-autoregressive forecast of the present value of future dividends is, for each year, roughly a weighted average of moving-average earnings and current real price, with between two thirds and three fourths of the weight on the earnings measure. We develop the implications of this for the present-value model of stock prices and for recent results that long-horizon stock returns are highly forecastable., Economics
- Published
- 1988
4. Representation through Legislative Redistricting: A Stochastic Model
- Author
King, Gary
- Abstract
This paper builds a stochastic model of the processes that give rise to observed patterns of representation and bias in congressional and state legislative elections. The analysis demonstrates that partisan swing and incumbency voting, concepts from the congressional elections literature, have determinate effects on representation and bias, concepts from the redistricting literature. The model shows precisely how incumbency and increased variability of partisan swing reduce the responsiveness of the electoral system and how partisan swing affects whether the system is biased toward one party or the other. Incumbency, and other causes of unresponsive representation, also reduce the effect of partisan swing on current levels of partisan bias. By relaxing the restrictive portions of the widely applied "uniform partisan swing" assumption, the theoretical analysis leads directly to an empirical model enabling one more reliably to estimate responsiveness and bias from a single year of electoral data. Applying this to data from seven elections in each of six states, the paper demonstrates that redistricting has effects in predicted directions in the short run: partisan gerrymandering biases the system in favor of the party in control and, by freeing up seats held by opposition party incumbents, increases the system's responsiveness. Bipartisan-controlled redistricting appears to reduce bias somewhat and dramatically to reduce responsiveness. Nonpartisan redistricting processes substantially increase responsiveness but do not have as clear an effect on bias. However, after only two elections, prima facie evidence for redistricting effects evaporates in most states. Finally, across every state and type of redistricting process, responsiveness declined significantly over the course of the decade. This is clear evidence that the phenomenon of "vanishing marginals," recognized first in the U.S. Congress literature, also applies to these different types of state legislative assemblies. It also strongly suggests that redistricting could not account for this pattern., Government
- Published
- 1989
5. Variance Specification in Event Count Models: From Restrictive Assumptions to a Generalized Estimator
- Author
King, Gary
- Abstract
This paper discusses the problem of variance specification in models for event count data. Event counts are dependent variables that can take on only nonnegative integer values, such as the number of wars or coups d'etat in a year. I discuss several generalizations of the Poisson regression model, presented in King (1988), to allow for substantively interesting stochastic processes that do not fit into the Poisson framework. Individual models that cope with, and help analyze, heterogeneity, contagion, and negative contagion are each shown to lead to specific statistical models for event count data. In addition, I derive a new generalized event count (GEC) model that enables researchers to extract significant amounts of new information from existing data by estimating features of these unobserved substantive processes. Applications of this model to congressional challenges of presidential vetoes and superpower conflict demonstrate the dramatic advantages of this approach., Government
- Published
- 1989
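A small simulation of the variance-specification issue this abstract describes: Poisson counts have variance equal to the mean, while contagion-style processes are overdispersed. All parameters below are invented; this is a sketch of the phenomenon, not the paper's GEC estimator.

```python
# Poisson counts satisfy variance = mean; contagion produces overdispersion.
import numpy as np

rng = np.random.default_rng(1)
mean = 4.0
poisson_counts = rng.poisson(mean, size=10_000)
# Negative binomial with the same mean but extra-Poisson variation:
overdispersed = rng.negative_binomial(n=2, p=2 / (2 + mean), size=10_000)

for name, c in [("Poisson", poisson_counts), ("overdispersed", overdispersed)]:
    print(f"{name:>13}: mean={c.mean():.2f}, variance={c.var():.2f}")
# Expect variance near 4 for the Poisson draw and near 12 for the negative binomial.
```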
6. A Unified Model of Cabinet Dissolution in Parliamentary Democracies
- Author
King, Gary, Alt, James E., Burns, Nancy, and Laver, Michael
- Abstract
The literature on cabinet duration is split between two apparently irreconcilable positions. The attributes theorists seek to explain cabinet duration as a fixed function of measured explanatory variables, while the events process theorists model cabinet durations as a product of purely stochastic processes. In this paper we build a unified statistical model that combines the insights of these previously distinct approaches. We also generalize this unified model, and all previous models, by including (1) a stochastic component that takes into account the censoring that occurs as a result of governments lasting to the vicinity of the maximum constitutional interelection period, (2) a systematic component that precludes the possibility of negative duration predictions, and (3) a much more objective and parsimonious list of explanatory variables, the explanatory power of which would not be improved by including a list of indicator variables for individual countries., Government
- Published
- 1990
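A hedged sketch of the censoring idea in point (1) of this abstract: an exponential duration model in which spells reaching a constitutional maximum are censored and contribute the survivor function to the likelihood. The rate and the 48-month limit are invented; the paper's unified model is much richer than this.

```python
# Exponential duration model with right-censoring at a constitutional maximum.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
true_mean = 30.0                          # invented mean cabinet duration (months)
limit = 48.0                              # invented maximum interelection period
t = rng.exponential(true_mean, size=300)
observed = np.minimum(t, limit)
censored = t >= limit                     # cabinets lasting to the limit

def neg_log_lik(rate):
    # Uncensored spells contribute the density, log f = log(rate) - rate*t;
    # censored spells contribute the survivor function, log S = -rate*t.
    ll = np.where(censored, -rate * observed, np.log(rate) - rate * observed)
    return -ll.sum()

fit = minimize_scalar(neg_log_lik, bounds=(1e-6, 1.0), method="bounded")
print("estimated mean duration (months):", 1 / fit.x)  # near 30
```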
7. Estimating Incumbency Advantage Without Bias
- Author
King, Gary and Gelman, Andrew
- Abstract
In this paper we prove theoretically and demonstrate empirically that all existing measures of incumbency advantage in the congressional elections literature are biased or inconsistent. We then provide an unbiased estimator based on a very simple linear regression model. We apply this new method to congressional elections since 1900, providing the first evidence of a positive incumbency advantage in the first half of the century., Government
- Published
- 1990
8. The Arctic Boundary Layer Expedition (ABLE 3A): July–August 1988
- Author
Harriss, R. C., Wofsy, Steven Charles, Bartlett, D. S., Shipham, M. C., Jacob, Daniel James, Hoell, J. M., Bendura, R. J., Drewry, J. W., McNeal, R. J., Navarro, R. L., Gidge, R. N., and Rabine, V. E.
- Abstract
The Arctic Boundary Layer Expedition (ABLE 3A) used measurements from ground, aircraft, and satellite platforms to characterize the chemistry and dynamics of the lower atmosphere over Arctic and sub-Arctic regions of North America during July and August 1988. The primary objectives of ABLE 3A were to investigate the magnitude and variability of methane emissions from the tundra ecosystem, and to elucidate factors controlling ozone production and destruction in the Arctic atmosphere. This paper reports the experimental design for ABLE 3A and a summary of results. Methane emissions from the tundra landscape varied widely from −2.1 to 426 mg CH4 m⁻² d⁻¹. Soil moisture and temperature were positively correlated with methane emission rates, indicating quantitative linkages between seasonal climate variability and soil metabolism. Enclosure flux measurement techniques, tower-based eddy correlation, and airborne eddy correlation flux measurements all proved robust for application to methane studies in the tundra ecosystem. Measurements and photochemical modeling of factors involved in ozone production and destruction validated the hypothesized importance of low NOx concentrations as a dominant factor in maintaining the pristine Arctic troposphere as an ozone sink. Stratospheric intrusions, long-range transport of mid-latitude pollution, forest fires, lightning, and aircraft are all potential sources of NOx and NOy to Arctic and sub-Arctic regions. ABLE 3A results indicate that human activities may have already enhanced NOy inputs to the region to the extent that the lifetime of O3 against photochemical loss may have already doubled. A doubling of NOx concentration from present levels would lead to net photochemical production of O3 during summer months in the Arctic (Jacob et al., this issue (a)). The ABLE 3A results indicate that atmospheric chemical changes in the northern high latitudes may serve as unique early warning indicators of the rates and magnitude of global environmental change., Engineering and Applied Sciences
- Published
- 1992
9. Passive tracer transport relevant to the TRACE A experiment
- Author
Krishnamurti, T. N., Sinha, M. C., Kanamitsu, M., Oosterhof, D., Fuelberg, H., Chatfield, R., Jacob, Daniel James, and Logan, Jennifer A.
- Abstract
This paper explores some of the mechanisms governing the accumulation of passive tracers over the tropical southern Atlantic Ocean during the northern hemisphere fall season. There has been a pioneering observation regarding ozone maxima over the South Atlantic during austral spring. The understanding of the formation of this maximum has been the prime motivation for this study. Using a global model as a frame of reference, we have carried out three kinds of experiments during the period of the Transport and Atmospheric Chemistry Near the Equator-Atlantic (TRACE A) project of 1992. The first of these is a simple advection of total ozone (a passive tracer) in time using the Florida State University global spectral model. Integration over the period of roughly 1 week showed that the model quite closely replicates the behavior of the observed total ozone from the total ozone mapping spectrometer (TOMS). This includes many of the changes in the features of total ozone over the tropical and subtropical region of the southern Atlantic Ocean. These studies suggest a correlation of 0.8 between the observed ozone over this region and ozone modeled from “dynamics alone,” i.e., without recourse to any photochemistry. The second series of experiments invokes sustained sources of a tracer over the biomass burn region of Africa and Brazil. Furthermore, sustained sources were also introduced in the active frontal “descending air” region of the southern hemisphere and over the Asian monsoon's east-west circulation. These experiments strongly suggest that air motions help to accumulate tracer elements over the tropical southern Atlantic Ocean. A third series of experiments addresses what may be required to improve the deficiencies of the vertical stratification of ozone predicted by the model over the flight region of the tropical southern Atlantic during TRACE A. Here we use the global model to optimally derive plausible accumulation of burn elements over the fire count regions of Brazil and Africa to provide passive tracer advections to closely match what was observed from reconnaissance aircraft-based measurements of ozone over the tropical southern Atlantic Ocean., Engineering and Applied Sciences
- Published
- 1996
10. A Survey of Corporate Governance
- Author
Shleifer, Andrei and Vishny, Robert W.
- Abstract
This paper surveys research on corporate governance, with special attention to the importance of legal protection of investors and of ownership concentration in corporate governance systems around the world., Economics
- Published
- 1997
11. Good News for Value Stocks: Further Evidence on Market Efficiency
- Author
LaPorta, Rafael, Lakonishok, Josef, Shleifer, Andrei, and Vishny, Robert
- Abstract
This paper examines the hypothesis that the superior return to so-called value stocks is the result of expectational errors made by investors. We study stock price reactions around earnings announcements for value and glamour stocks over a 5-year period after portfolio formation. The announcement returns suggest that a significant portion of the return difference between value and glamour stocks is attributable to earnings surprises that are systematically more positive for value stocks. The evidence is inconsistent with a risk-based explanation for the return differential., Economics
- Published
- 1997
12. Uncovering Some Causal Relationships Between Productivity Growth and the Structure of Economic Fluctuations: A Tentative Survey
- Author
Aghion, Philippe and Saint-Paul, Gilles
- Abstract
This paper discusses recent theoretical and empirical work on the interactions between growth and business cycles. One may distinguish two very different types of approaches to the problem of the influence of macroeconomic fluctuations on long-run growth. In the first type of approach, which relies on learning-by-doing mechanisms or aggregate demand externalities, productivity growth and direct production activities are complements. An expansion therefore has a positive long-run effect on total factor productivity. In the second type of approach, hereafter labeled 'opportunity cost', productivity growth and production activities are substitutes. The opportunity cost of some productivity-improving activities falls in a recession, which has a long-run positive impact on output. This does not mean, however, that recessions should on average last longer or be more frequent, since the expectation of future recessions reduces today's incentives for productivity growth. We also briefly discuss some empirical work which is mildly supportive of the opportunity cost approach, while showing that it can be reconciled with the observed pro-cyclical behavior of measured total factor productivity. We also describe some theoretical work on the effects of growth on business cycles., Economics
- Published
- 1998
13. Agency Problems and Dividend Policies around the World
- Author
La Porta, Rafael, Lopez-de-Silanes, Florencio, Shleifer, Andrei, and Vishny, Robert W.
- Abstract
This paper outlines and tests two agency models of dividends. According to the “outcome” model, dividends are the result of effective pressure by minority shareholders to force corporate insiders to disgorge cash. According to the “substitute” model, insiders interested in issuing equity in the future choose to pay dividends to establish a reputation for decent treatment of minority shareholders. The first model predicts that stronger minority shareholder rights should be associated with higher dividend payouts; the second model predicts the opposite. Tests on a cross-section of 4,000 companies from 33 countries with different levels of minority shareholder rights support the outcome agency model of dividends., Economics
- Published
- 2000
14. Distribution and fate of selected oxygenated organic species in the troposphere and lower stratosphere over the Atlantic
- Author
Singh, H., Chen, Y., Tabazadeh, A., Fukui, Y., Bey, I., Yantosca, Robert M., Jacob, Daniel James, Arnold, F., Wohlfrom, K., Atlas, E., Flocke, F., Blake, D., Blake, N., Heikes, B., Snow, J., Talbot, R., Gregory, G., Sachse, G., Vay, S., and Kondo, Yasuyuki
- Abstract
A large number of oxygenated organic chemicals (peroxyacyl nitrates, alkyl nitrates, acetone, formaldehyde, methanol, methylhydroperoxide, acetic acid and formic acid) were measured during the 1997 Subsonic Assessment (SASS) Ozone and Nitrogen Oxide Experiment (SONEX) airborne field campaign over the Atlantic. In this paper, we present a first picture of the distribution of these oxygenated organic chemicals (Ox-organic) in the troposphere and the lower stratosphere, and assess their source and sink relationships. In both the troposphere and the lower stratosphere, the total atmospheric abundance of these oxygenated species (ΣOx-organic) nearly equals that of total nonmethane hydrocarbons (ΣNMHC), which have been traditionally measured. A sizable fraction of the reactive nitrogen (10–30%) is present in its oxygenated organic form. The organic reactive nitrogen fraction is dominated by peroxyacetyl nitrate (PAN), with alkyl nitrates and peroxypropionyl nitrate (PPN) accounting for <5% of total NOy. Comparison of observations with the predictions of the Harvard three-dimensional global model suggests that in many key areas (e.g., formaldehyde and peroxides) substantial differences between measurements and theory are present and must be resolved. In the case of CH3OH, there appears to be a large mismatch between atmospheric concentrations and estimated sources, indicating the presence of major unknown removal processes. Instrument intercomparisons as well as disagreements between observations and model predictions are used to identify needed improvements in key areas. The atmospheric chemistry and sources of this group of chemicals are poorly understood even though their fate is intricately linked with upper tropospheric NOx and HOx cycles., Engineering and Applied Sciences
- Published
- 2000
15. Government Ownership of Banks
- Author
La Porta, Rafael, Lopez-De-Silanes, Florencio, and Shleifer, Andrei
- Abstract
In this paper, we investigate a neglected aspect of financial systems of many countries around the world: government ownership of banks. We assemble data which establish four findings. First, government ownership of banks is large and pervasive around the world. Second, such ownership is particularly significant in countries with low levels of per capita income, underdeveloped financial systems, interventionist and inefficient governments, and poor protection of property rights. Third, government ownership of banks is associated with slower subsequent financial development. Finally, government ownership of banks is associated with lower subsequent growth of per capita income, and in particular with lower growth of productivity rather than slower factor accumulation. This evidence is inconsistent with the optimistic “development” theories of government ownership of banks common in the 1960s, but supports the more recent “political” theories of the effects of government ownership of firms., Economics
- Published
- 2002
16. Object Space EWA Surface Splatting: A Hardware Accelerated Approach to High Quality Point Rendering
- Author
Ren, Liu, Pfister, Hanspeter, and Zwicker, Matthias
- Abstract
Elliptical weighted average (EWA) surface splatting is a technique for high quality rendering of point-sampled 3D objects. EWA surface splatting renders water-tight surfaces of complex point models with high quality, anisotropic texture filtering. In this paper we introduce a new multi-pass approach to perform EWA surface splatting on modern PC graphics hardware, called object space EWA splatting. We derive an object space formulation of the EWA filter, which is amenable to acceleration by conventional triangle-based graphics hardware. We describe how to implement the object space EWA filter using a two-pass rendering algorithm. In the first rendering pass, visibility splatting is performed by shifting opaque surfel polygons backward along the viewing rays, while in the second rendering pass view-dependent EWA prefiltering is performed by deforming texture-mapped surfel polygons. We use texture mapping and alpha blending to facilitate the splatting process. We implement our algorithm using programmable vertex and pixel shaders, fully exploiting the capabilities of today’s graphics processing units (GPUs). Our implementation renders up to 3 million points per second on recent PC graphics hardware, an order of magnitude more than a pure software implementation of screen space EWA surface splatting., Engineering and Applied Sciences
- Published
- 2002
17. Submajority Rules: Forcing Accountability upon Majorities
- Author
Vermeule, Cornelius Adrian
- Abstract
Legal and political theory have paid a great deal of attention to supermajority rules, which require a fraction of votes greater than 1/2+1 to reach a decision, and thus empower a minority to block change. In this paper I consider the opposite deviation from simple majority rule: submajority rules, under which a voting minority is granted the affirmative power to change the status quo. Among the examples I will consider are:
- The Journal Clause, which allows 1/5 of the legislators present in either House to force a roll-call vote;
- The discharge rule in the House, which (at various points, although not today) has permitted a specified minority of legislators to force bills out of committee for consideration on the floor;
- Senate Rule XXII, under which a cloture petition is valid when signed by sixteen Senators;
- The Seven Member Rule, under which a minority of designated committees in the House and Senate can require the executive branch to divulge information;
- House Rule XI, which entitles committee minorities to call witnesses at hearings;
- The famous Rule of Four that allows four Justices to grant a writ of certiorari and thereby put a case on the Supreme Court's agenda;
- Rules governing direct democracy that permit a defined minority of a state's electorate to place a question on the ballot, or to force a recall election;
- Rules governing international organizations, which frequently allow a defined minority to call an emergency session or to force a roll-call vote.
Submajority rules are rarely discussed, either because they are assumed not to exist, or because they are assumed to lack any institutional virtues, or because submajoritarian decisions are assumed to be chronically unstable in light of the risk that subsequent majorities will reverse the submajority's decision. I will dispute all three assumptions. Submajority rules have important procedural and deliberative virtues: in a range of situations they enable a minority to force public accountability upon a majority, to the benefit of the institution as a whole. The reversibility problem can be, and is, dampened by other institutional rules and norms that protect submajoritarian decisions, or by the simpler expedient of adopting submajority rules only for decisions that are inherently irreversible or costly to reverse, such as decisions that release information into the public domain.
- Published
- 2005
18. The Sloan Digital Sky Survey monitor telescope pipeline
- Author
Tucker, D.L., Kent, S., Richmond, M.W., Annis, J., Smith, J.A., Allam, S.S., Rodgers, C.T., Stute, J.L., Adelman-McCarthy, J.K., Brinkmann, J., Doi, M., Finkbeiner, Douglas, Fukugita, M., Goldston, J., Greenway, B., Gunn, J.E., Hendry, J.S., Hogg, D.W., Ichikawa, S.-I., Ivezić, Ž., Knapp, G.R., Lampeitl, H., Lee, B.C., Lin, H., McKay, T.A., Merrelli, A., Munn, J.A., Neilsen, E.H., Newberg, H.J., Richards, G.T., Schlegel, D.J., Stoughton, C., Uomoto, A., and Yanny, B.
- Subjects
methods: data analysis, techniques: image processing, techniques: photometric, surveys
- Abstract
The photometric calibration of the Sloan Digital Sky Survey (SDSS) is a multi-step process which involves data from three different telescopes: the 1.0-m telescope at the US Naval Observatory (USNO), Flagstaff Station, Arizona (which was used to establish the SDSS standard star network); the SDSS 0.5-m Photometric Telescope (PT) at the Apache Point Observatory (APO), New Mexico (which calculates nightly extinctions and calibrates secondary patch transfer fields); and the SDSS 2.5-m telescope at APO (which obtains the imaging data for the SDSS proper). In this paper, we describe the Monitor Telescope Pipeline, MTPIPE, the software pipeline used in processing the data from the single-CCD telescopes used in the photometric calibration of the SDSS (i.e., the USNO 1.0-m and the PT). We also describe transformation equations that convert photometry on the USNO-1.0m u'g'r'i'z' system to photometry on the SDSS 2.5m ugriz system and the results of various validation tests of the MTPIPE software. Further, we discuss the semi-automated PT factory, which runs MTPIPE in the day-to-day standard SDSS operations at Fermilab. Finally, we discuss the use of MTPIPE in current SDSS-related projects, including the Southern u'g'r'i'z' Standard Star project, the u'g'r'i'z' Open Star Clusters project, and the SDSS extension (SDSS-II)., Astronomy
- Published
- 2006
19. Credit Constraints as a Barrier to the Entry and Post-Entry Growth of Firms
- Author
Scarpetta, Stefano, Fally, Thibault, and Aghion, Philippe
- Subjects
micro data, firm size, post-entry growth, entry, financial development
- Abstract
Advanced market economies are characterized by a continuous process of creative destruction. Market forces and technological developments play a major role in shaping this process, but institutional and policy settings also influence firms' decision to enter, to expand if successful and to exit if competition becomes unbearable. In this paper we focus on the effects of financial development on the entry of new firms and the expansion of successful new businesses. Drawing from harmonized firm-level data for 16 industrialized and emerging economies, we find that access to finance matters most for the entry of small firms and in sectors that are more dependent upon external finance. This finding is robust to controlling for other potential entry barriers (labour market regulations and entry regulations). On the other hand, financial development has either no effect or a negative effect on entry by large firms. Access to finance also helps new firms expand if successful. Both private credit and stock market capitalization are important for promoting entry and post-entry growth of firms. Altogether, these results suggest that, despite significant progress over the past decade, many countries, including those in Continental Europe, should improve their financial markets so as to get the most out of creative destruction, by encouraging the entry of new (especially small) firms and the post-entry growth of successful young businesses., Economics
- Published
- 2007
20. Discrepancies Between Score Trends from NAEP and State Tests: A Scale-Invariant Perspective
- Author
Ho, Andrew Dean
- Abstract
State test score trends are widely interpreted as indicators of educational improvement. To validate these interpretations, state test score trends are often compared to trends on other tests such as the National Assessment of Educational Progress (NAEP). These comparisons raise serious technical and substantive concerns. Technically, the most commonly used trend statistics – for example, the change in the percent of proficient students – are misleading in the context of cross-test comparisons. Substantively, it may not be reasonable to expect that NAEP and state test score trends should be similar. This paper motivates then applies a “scale-invariant” framework for cross-test trend comparisons to compare “high-stakes” state test score trends from 2003 to 2005 to NAEP trends over the same period. Results show that state trends are significantly more positive than NAEP trends. The paper concludes with cautions against the positioning of trend discrepancies in a framework where only one trend is considered “true.”
- Published
- 2007
21. Rethinking the Context of Production through an Archaeological Study of Ancient Salt Production in the Sichuan Basin, China
- Author
Flad, Rowan K.
- Subjects
Three Gorges, specialization, salt production, context, value
- Abstract
Excavations at a salt-production site named Zhongba in the Three Gorges region of China document a complex system of intersecting activities that changed gradually over a long period of time. As the salt production became much larger in scale during the Bronze Age, the context of this production shifted from one for which there is no archaeological evidence for attachment between producers and those who control the products to a situation in which the exchange of salt to other regions seems to have been controlled, or at least directed, by an emergent elite whose authority was based in part on divinatory ability and the control of ritual knowledge. This study examines the concept of context in relation to the organization of salt production at this site and argues that multiple lines of evidence must be considered if we are to avoid simplified assumptions concerning the nature of products and the production processes through which they are made., Anthropology
- Published
- 2007
22. Commentary: All species are important, but some species are more important than others
- Author
Ellison, Aaron M. and Degrassi, Allyson L.
- Abstract
Foundation species control biodiversity and ecosystem processes, but are difficult to identify. In this issue of Journal of Vegetation Science, Elumeeva et al. show that Festuca varia and Nardus stricta act as foundation species in the Caucasus’ alpine. This paper augments the piecemeal literature on foundation species while highlighting the need for more comprehensive approaches to their identification and conservation., Organismic and Evolutionary Biology
- Published
- 2008
23. Disparities in Defining Disparities: Statistical Conceptual Frameworks
- Author
Chen, Chih-nan, Duan, Naihua, Meng, Xiao-Li, Lin, Julia Y., and Alegria, Margarita
- Subjects
mental health, counterfactual populations, potential outcomes, disparities, weighting, Simpson's paradox
- Abstract
Motivated by the need to meaningfully implement the Institute of Medicine's (IOM's) definition of health care disparity, this paper proposes statistical frameworks that lay out explicitly the needed causal assumptions for defining disparity measures. Our key emphasis is that a scientifically defensible disparity measure must take into account the direction of the causal relationship between allowable covariates that are not considered to be contributors to disparity and non-allowable covariates that are considered to be contributors to disparity, to avoid flawed disparity measures based on implausible populations that are not relevant for clinical or policy decisions. However, these causal relationships are usually unknown and undetectable from observed data. Consequently, we must make strong causal assumptions in order to proceed. Two frameworks are proposed in this paper, one is the conditional disparity framework under the assumption that allowable covariates impact non-allowable covariates but not vice versa. The other is the marginal disparity framework under the assumption that non-allowable covariates impact allowable ones but not vice versa. We establish theoretical conditions under which the two disparity measures are the same and present a theoretical example showing that the difference between the two disparity measures can be arbitrarily large. Using data from the Collaborative Psychiatric Epidemiology Survey, we also provide an example where the conditional disparity is misled by Simpson's paradox, whereas the marginal disparity approach handles it correctly., Statistics
- Published
- 2008
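A standard numeric illustration of the Simpson's paradox hazard this abstract mentions. The figures below are the classic textbook numbers, not data from the paper: group A fares better within each stratum, yet group B looks better when the strata are pooled.

```python
# Classic Simpson's paradox: stratum-wise and pooled comparisons reverse.
# (successes, totals) by stratum; invented textbook figures, not the paper's data.
group_a = {"stratum 1": (81, 87), "stratum 2": (192, 263)}
group_b = {"stratum 1": (234, 270), "stratum 2": (55, 80)}

for s in ("stratum 1", "stratum 2"):
    ra = group_a[s][0] / group_a[s][1]
    rb = group_b[s][0] / group_b[s][1]
    print(f"{s}: A={ra:.2f}  B={rb:.2f}")      # A is higher in both strata

pooled_a = sum(k for k, _ in group_a.values()) / sum(n for _, n in group_a.values())
pooled_b = sum(k for k, _ in group_b.values()) / sum(n for _, n in group_b.values())
print(f"pooled:    A={pooled_a:.2f}  B={pooled_b:.2f}")  # yet B is higher pooled
```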
24. In Search of Distress Risk
- Author
Szilagyi, Jan, Hilscher, Jens, and Campbell, John
- Abstract
This paper explores the determinants of corporate failure and the pricing of financially distressed stocks whose failure probability, estimated from a dynamic logit model using accounting and market variables, is high. Since 1981, financially distressed stocks have delivered anomalously low returns. They have lower returns but much higher standard deviations, market betas, and loadings on value and small-cap risk factors than stocks with low failure risk. These patterns are more pronounced for stocks with possible informational or arbitrage-related frictions. They are inconsistent with the conjecture that the value and size effects are compensation for the risk of financial distress., Economics
- Published
- 2008
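An illustrative logistic regression in the spirit of the "dynamic logit model using accounting and market variables" this abstract mentions. The covariates and coefficients are invented stand-ins, not a replication of the paper's estimator.

```python
# Logit of failure on invented accounting/market-style covariates, by ML.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n = 2_000
X = np.column_stack([
    np.ones(n),
    rng.normal(size=n),   # invented profitability-style variable
    rng.normal(size=n),   # invented leverage-style variable
])
beta_true = np.array([-3.0, -1.0, 1.5])
fail = rng.binomial(1, 1 / (1 + np.exp(-(X @ beta_true))))

def neg_log_lik(beta):
    eta = X @ beta
    # Bernoulli/logit log-likelihood, written with logaddexp for stability.
    return np.logaddexp(0.0, eta).sum() - fail @ eta

fit = minimize(neg_log_lik, np.zeros(3), method="BFGS")
print("fitted coefficients:", fit.x)  # should land near [-3, -1, 1.5]
```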
25. Why the Euro Will Rival the Dollar
- Author
Chinn, Menzie and Frankel, Jeffrey A.
- Abstract
The euro has arisen as a credible eventual competitor to the dollar as leading international currency, much as the dollar rose to challenge the pound 70 years ago. This paper uses econometrically-estimated determinants of the shares of major currencies in the reserve holdings of the world’s central banks. Significant factors include: size of the home country, rate of return, and liquidity in the relevant home financial center (as measured by the turnover in its foreign exchange market). There is a tipping phenomenon, but changes are felt only with a long lag (we estimate a weight on the preceding year’s currency share around .9). The equation correctly predicts out-of-sample a (small) narrowing in the gap between the dollar and euro over the period 1999-2007. This paper updates calculations regarding possible scenarios for the future. We exclude the scenario where the United Kingdom joins euroland. But we do take into account the fact that London has nonetheless become the de facto financial center of the euro, more so than Frankfurt. We also assume that the dollar continues in the future to depreciate at the trend rate that it has shown on average over the last 20 years. The conclusion is that the euro may surpass the dollar as leading international reserve currency as early as 2015.
- Published
- 2008
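A toy rendering of the lag structure this abstract estimates (a weight around .9 on the preceding year's currency share): a partial-adjustment recursion in which roughly 10% of the gap to the long-run share closes each year. The share and target values below are invented.

```python
# Partial-adjustment recursion with weight 0.9 on last year's share.
share = 0.65     # hypothetical current reserve share of the incumbent currency
target = 0.40    # hypothetical long-run equilibrium share
rho = 0.9        # estimated weight on the preceding year's share

for year in range(2008, 2016):
    share = rho * share + (1 - rho) * target
    print(year, round(share, 3))
# Only about 10% of the remaining gap closes each year, which is why the
# tipping toward the euro plays out over many years rather than all at once.
```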
26. A Theory of Liquidity and Regulation of Financial Intermediation
- Author
Tsyvinski, Aleh, Golosov, Mikhail, and Farhi, Emmanuel
- Subjects
mechanism design, market failures, optimal contracts, financial intermediation, optimal regulations
- Abstract
This paper studies a Diamond–Dybvig model of providing insurance against unobservable liquidity shocks in the presence of unobservable trades. We show that competitive equilibria are inefficient. A social planner finds it beneficial to introduce a wedge between the interest rate implicit in optimal allocations and the economy's marginal rate of transformation. This improves risk sharing by reducing the attractiveness of joint deviations where agents simultaneously misrepresent their type and engage in trades on private markets. We propose a simple implementation of the optimum that imposes a constraint on the portfolio share that financial intermediaries invest in short-term assets., Economics
- Published
- 2009
27. The Energetic Significance of Cooking
- Author
Wrangham, Richard W. and Carmody, Rachel Naomi
- Subjects
energy, cooking, raw, starch, meat, processing, Homo, Lower Paleolithic
- Abstract
While cooking has long been argued to improve the diet, the nature of the improvement has not been well defined. As a result, the evolutionary significance of cooking has variously been proposed as being substantial or relatively trivial. In this paper, we evaluate the hypothesis that an important and consistent effect of cooking food is a rise in its net energy value. The pathways by which cooking influences net energy value differ for starch, protein and lipid, and we therefore consider plant and animal foods separately. Evidence of compromised physiological performance among individuals on raw diets supports the hypothesis that cooked diets tend to provide more energy. Mechanisms contributing to energy being gained from cooking include increased digestibility of starch and protein, reduced costs of digestion for cooked versus raw meat, and reduced energetic costs of detoxification and defense against pathogens. If cooking indeed consistently improves the energetic value of foods through such mechanisms, its evolutionary impact depends partly on the relative energetic benefits of non-thermal processing methods used prior to cooking. We suggest that if non-thermal processing methods, such as pounding, were used by Lower Paleolithic Homo, they likely provided an important increase in energy gain over unprocessed raw diets. However, cooking has critical effects not easily achievable by non-thermal processing, including the relatively complete gelatinization of starch, efficient denaturing of proteins, and killing of foodborne pathogens. This means that however sophisticated the non-thermal processing methods were, cooking would have conferred incremental energetic benefits. While much remains to be discovered, we conclude that the adoption of cooking would have led to an important rise in energy availability. For this reason, we predict that cooking had substantial evolutionary significance., Anthropology, Human Evolutionary Biology
- Published
- 2009
28. The Origin and Content of Expletives: Evidence from 'Selection'
- Author
Deal, Amy Rose
- Subjects
there-insertion, inchoatives, economy, agreement, phase theory
- Abstract
While expletive there has primarily been studied in the context of the existential construction, it has long been known that some but not all lexical verbs are compatible with there-insertion. This paper argues that there-insertion can be used to diagnose vPs with no external argument, ruling out transitives, unergatives, and also inchoatives, which are argued to project an event argument on the edge of vP. Based on the tight link between there-insertion and low functional structure, I build a case for low there-insertion, where the expletive is first merged in the specifier of a verbalizing head v. The low merge position is motivated by a stringently local relation that holds between there and its associate DP; this relation plays a crucial role in the interaction of there with raising verbs, where local agreement rules out cases of “too many theres” such as *There seemed there to be a man in the room. An account of these cases in terms of phase theory is explored, ultimately suggesting that there must be merged in a non-thematic phasal specifier position., Linguistics
- Published
- 2009
29. Vertical Networks, Integration, and Connectivity
- Author
Dogan, Pinar
- Subjects
MBG - Markets, Business, and Government, Vertical Integration, Interconnection, Network Externalities
- Abstract
This paper studies competition in a network industry with a stylized two-layered network structure, and examines: (i) price and connectivity incentives of the upstream networks, and (ii) incentives for vertical integration between an upstream network provider and a downstream firm. The main result of this paper is that vertical integration occurs only if the initial installed-base difference between the upstream networks is sufficiently small; in that case, the industry is configured with two vertically integrated networks, which yields the highest incentives to invest in quality of interconnection. When the installed-base difference is sufficiently large, there is no integration in the industry, and neither of the firms has an incentive to invest in quality of interconnection. An industry configuration in which only the large network integrates and excludes (or raises the cost of) its downstream rival does not appear as an equilibrium outcome: in the presence of a large asymmetry between the networks, when quality of interconnection is a strategic variable, the large network can exercise substantial market power without vertical integration. Therefore, a vertically separated industry structure does not necessarily yield procompetitive outcomes.
- Published
- 2009
30. Bayesian change-point analysis for atomic force microscopy and soft material indentation
- Author
Rudoy, Daniel, Yuen, Shelten G., Howe, Robert D., and Wolfe, Patrick J.
- Subjects
changepoint detection, constrained switching regressions, hierarchical Bayesian models, indentation testing, Markov Chain Monte Carlo, Young's modulus
- Abstract
Material indentation studies, in which a probe is brought into controlled physical contact with an experimental sample, have long been a primary means by which scientists characterize the mechanical properties of materials. More recently, the advent of atomic force microscopy, which operates on the same fundamental principle, has in turn revolutionized the nanoscale analysis of soft biomaterials such as cells and tissues. This paper addresses the inferential problems associated with material indentation and atomic force microscopy, through a framework for the changepoint analysis of pre- and post-contact data that is applicable to experiments across a variety of physical scales. A hierarchical Bayesian model is proposed to account for experimentally observed changepoint smoothness constraints and measurement error variability, with efficient Monte Carlo methods developed and employed to realize inference via posterior sampling for parameters such as Young’s modulus, a key quantifier of material stiffness. These results are the first to provide the materials science community with rigorous inference procedures and uncertainty quantification, via optimized and fully automated high-throughput algorithms, implemented as the publicly available software package BayesCP. To demonstrate the consistent accuracy and wide applicability of this approach, results are shown for a variety of data sets from both macro- and micro-materials experiments—including silicone, neurons, and red blood cells—conducted by the authors and others., Engineering and Applied Sciences
- Published
- 2010
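A toy version of the changepoint inference this abstract describes: an exact posterior over a single changepoint in a Gaussian sequence, computed by enumeration. The data, means, and variances are invented; this is not the hierarchical model or the MCMC sampler in BayesCP.

```python
# Exact single-changepoint posterior in a Gaussian sequence, by enumeration.
import numpy as np

rng = np.random.default_rng(3)
y = np.concatenate([rng.normal(0.0, 1.0, 60),   # "pre-contact" segment
                    rng.normal(2.0, 1.0, 40)])  # "post-contact" segment
n = len(y)

log_post = np.full(n, -np.inf)
for k in range(5, n - 5):                       # candidate changepoint locations
    left, right = y[:k], y[k:]
    # Profile log-likelihood with per-segment means and known unit variance:
    log_post[k] = -0.5 * (((left - left.mean()) ** 2).sum()
                          + ((right - right.mean()) ** 2).sum())

posterior = np.exp(log_post - log_post.max())
posterior /= posterior.sum()
print("posterior mode for the changepoint:", posterior.argmax())  # expect ~60
```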
31. Bayesian Models for Detecting Epistatic Interactions from Genetic Data
- Author
Zhang, Yu, Jiang, Bo, Zhu, Jun, and Liu, Jun
- Subjects
Bayesian Methods, association mapping, epistasis, QTL
- Abstract
Current disease association studies are routinely conducted on a genome-wide scale, testing hundreds of thousands or millions of genetic markers. Besides detecting marginal associations of individual markers with the disease, it is also of interest to identify gene–gene and gene–environment interactions, which confer susceptibility to the disease risk. The astronomical number of possible combinations of markers and environmental factors, however, makes interaction mapping a daunting task both computationally and statistically. In this paper, we review and discuss a set of Bayesian partition methods developed recently for mapping single-nucleotide polymorphisms in case-control studies, their extension to quantitative traits, and further generalization to multiple traits. We use simulation and real data sets to demonstrate the performance of these methods, and we compare them with some existing interaction mapping algorithms. With the recent advance in high-throughput sequencing technologies, genome-wide measurements of epigenetic factor enrichment, structural variations, and transcription activities become available at the individual level. The tsunami of data creates more challenges for gene–gene interaction mapping, but at the same time provides new opportunities that, if utilized properly through sophisticated statistical means, can improve the power of mapping interactions at the genome scale., Statistics
- Published
- 2010
32. Foldable Printed Circuit Boards on Paper Substrates
- Author
Siegel, Adam C., Phillips, Scott T., Dickey, Michael D., Lu, Nanshu, Suo, Zhigang, and Whitesides, George McClelland
- Subjects
electro-textiles, flexible electronics, paper microfluidics, polymer circuits, radio-frequency identification
- Abstract
This paper describes several low-cost methods for fabricating flexible electronic circuits on paper. The circuits comprise i) metallic wires (e.g., tin or zinc) that are deposited on the substrate by evaporation, sputtering, or airbrushing, and ii) discrete surface-mountable electronic components that are fastened with conductive adhesive directly to the wires. These electronic circuits—like conventional printed circuit boards—can be produced with electronic components that connect on both sides of the substrate. Unlike printed circuit boards made from fiberglass, ceramics, or polyimides, however, paper can be folded and creased (repeatedly), shaped to form three-dimensional structures, trimmed using scissors, used to wick fluids (e.g., for microfluidic applications) and disposed of by incineration. Paper-based electronic circuits are thin and lightweight; they should be useful for applications in consumer electronics and packaging, for disposable systems for uses in the military and homeland security, for applications in medical sensing or low-cost portable diagnostics, for paper-based microelectromechanical systems, and for applications involving textiles., Chemistry and Chemical Biology, Engineering and Applied Sciences
- Published
- 2010
33. How Many Highly Skilled Foreign-Born are Waiting in Line for U.S. Legal Permanent Residence?
- Author
Jasso, Guillermina, Wadhwa, Vivek, Gereffi, Gary, Rissing, Ben, and Freeman, Richard Barry
- Abstract
While the United States welcomes foreign-born students and trainees and, less warmly, temporary workers such as H-1B visa holders, it places an array of requirements, obstacles, and delays upon persons who would like to make the U.S. their permanent home. The number of people in the queue for legal permanent residence (LPR) is, however, difficult to ascertain. This paper estimates the number of highly skilled foreign-born persons waiting for LPR via the three main employment-based categories, separately by whether they are living in the United States or abroad, as well as the number of family members. We find that as of the end of FY 2006 there were about half a million employment-based principals awaiting LPR in the United States, together with over half a million family members, plus over 125 thousand principals and family members waiting abroad. These numbers dwarf the visas available annually – 120,120 plus any not used in the family preferences – suggesting that the long delays in gaining legal permanent residence are a visa number problem, not an administrative processing problem, as many believe. The backlog thus cannot be eliminated without a large change in public policy. The delay in gaining legal permanent residence could contribute to the decision of many highly skilled foreign-born to leave the United States., Economics
- Published
- 2010
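A back-of-envelope check of the visa-number argument in this abstract. The (rounded) figures are taken from the abstract itself; the years-to-clear division is our own arithmetic, not a number the paper reports.

```python
# Rounded figures from the abstract; the division is our own illustration.
backlog = 500_000 + 500_000 + 125_000   # principals + family in U.S. + abroad
annual_visas = 120_120                  # employment-based visas per year
print(f"backlog = {backlog:,}")
print(f"years to clear at the current cap = {backlog / annual_visas:.1f}")
# Roughly 9.4 years even with no new applicants: a visa-number problem,
# not an administrative-processing one.
```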
34. Paper-Based ELISA
- Author
Cheng, Chao-Min, Martinez, Andres W., Gong, Jinlong, Mace, Charles R., Phillips, Scott T., Carrilho, Emanuel, Mirica, Katherine A., and Whitesides, George McClelland
- Subjects
analytical methods, antigens, bioassays, clinical chemistry, ELISA
- Abstract
Paper works: Paper-based indirect ELISA has been demonstrated through the detection of rabbit IgG and the HIV-1 envelope antigen gp41. This technique combines the sensitivity and specificity of ELISA with the low cost and ease of use of paper-based platforms., Chemistry and Chemical Biology
- Published
- 2010
35. Predictors of breastfeeding cessation among HIV-infected women in Dar es Salaam, Tanzania
- Author
Petraro, Paul, Duggan, Christopher Paul, Msamanga, Gernard, Peterson, Karen E., Spiegelman, Donna Lynn, and Fawzi, Wafaie W.
- Subjects
cessation, breastfeeding, HIV, pregnancy, social support
- Abstract
This paper examines predictors of breastfeeding cessation among a cohort of human immunodeficiency virus (HIV)-infected women. This was a prospective follow-up study of HIV-infected women who participated in a randomized micronutrient supplementation trial conducted in Dar es Salaam, Tanzania. 795 HIV-infected Tanzanian women with singleton newborns were drawn from the cohort for this analysis. The proportion of women breastfeeding declined from 95% at 12 months to 11% at 24 months. The multivariate analysis showed breastfeeding cessation was significantly associated with increasing calendar year of delivery from 1995 to 1997 [risk ratio (RR) 1.36; 95% confidence interval (CI) 1.13–1.63], having a new pregnancy (RR 1.33; 95% CI 1.10–1.61), overweight [body mass index (BMI) ≥25 kg m⁻²; RR 1.37; 95% CI 1.07–1.75], underweight (BMI <18.5 kg m⁻²; RR 1.29; 95% CI 1.00–1.65), and introduction of cow’s milk at the infant’s age of 4 months (RR 1.30; 95% CI 1.04–1.63). Material and social support was associated with a decreased likelihood of cessation (RR 0.83; 95% CI 0.68–1.02). Demographic, health and nutritional factors among women and infants are associated with decisions by HIV-infected women to cease breastfeeding. The impact of breastfeeding counselling programs for HIV-infected African women should consider individual maternal, social and health contexts.
- Published
- 2010
36. Tensor kernels for simultaneous fiber model estimation and tractography
- Author
Rathi, Yogesh, Malcolm, James G., Michailovich, Oleg, Westin, Carl-Fredrik, Shenton, Martha Elizabeth, and Bouix, Sylvain
- Subjects
Diffusion-weighted MRI, tractography, diffusion tensor estimation
- Abstract
This paper proposes a novel framework for joint orientation distribution function (ODF) estimation and tractography based on a new class of tensor-kernels. Existing techniques estimate the local fiber orientation at each voxel independently so there is no running knowledge of confidence in the measured signal or estimated fiber orientation. In this work, fiber tracking is formulated as recursive estimation: at each step of tracing the fiber, the current estimate of the ODF is guided by the previous. To do this, second and higher order tensor based kernels are employed. A weighted mixture of these tensor-kernels is used for representing crossing and branching fiber structures. While tracing a fiber, the parameters of the mixture model are estimated based on the ODF at that location and a smoothness term that penalizes deviation from the previous estimate along the fiber direction. This ensures smooth estimation along the direction of propagation of the fiber. In synthetic experiments, using a mixture of two and three components it is shown that this approach improves the angular resolution at crossings. In vivo experiments using two and three components examine the corpus callosum and corticospinal tract and confirm the ability to trace through regions known to contain such crossing and branching.
- Published
- 2010
37. Transistors Formed from a Single Lithography Step Using Information Encoded in Topography
- Author
Dickey, Michael D., Russell, Kasey Joe, Lipomi, Darren J., Narayanamurti, Venkatesh, and Whitesides, George McClelland
- Subjects
Fabrication, Field-Effect Transistors, Lithography, Shadow Evaporation, Surface Chemistry
- Abstract
This paper describes a strategy for the fabrication of functional electronic components (transistors, capacitors, resistors, conductors, and logic gates but not, at present, inductors) that combines a single layer of lithography with angle-dependent physical vapor deposition; this approach is named topographically encoded microlithography (abbreviated as TEMIL). This strategy extends the simple concept of ‘shadow evaporation’ to reduce the number and complexity of the steps required to produce isolated devices and arrays of devices, and eliminates the need for registration (the sequential stacking of patterns with correct alignment) entirely. The defining advantage of this strategy is that it extracts information from the 3D topography of features in photoresist, and combines this information with the 3D information from the angle-dependent deposition (the angle and orientation used for deposition from a collimated source of material), to create ‘shadowed’ and ‘illuminated’ regions on the underlying substrate. It also takes advantage of the ability of replica molding techniques to produce 3D topography in polymeric resists. A single layer of patterned resist can thus direct the fabrication of a nearly unlimited number of possible shapes, composed of layers of any materials that can be deposited by vapor deposition. The sequential deposition of various shapes (by changing orientation and material source) makes it possible to fabricate complex structures—including interconnected transistors—using a single layer of topography. The complexity of structures that can be fabricated using simple lithographic features distinguishes this procedure from other techniques based on shadow evaporation., Chemistry and Chemical Biology, Engineering and Applied Sciences
- Published
- 2010
38. Ecophysiological Traits of Terrestrial and Aquatic Carnivorous Plants: Are the Costs and Benefits the Same?
- Author
Adamec, Lubomír and Ellison, Aaron M.
- Abstract
Identification of trade-offs among physiological and morphological traits and their use in cost-benefit models and ecological or evolutionary optimization arguments have been hallmarks of ecological analysis for at least 50 years. Carnivorous plants are model systems for studying a wide range of ecophysiological and ecological processes and the application of a cost-benefit model for the evolution of carnivory by plants has provided many novel insights into trait-based cost-benefit models. Central to the cost-benefit model for the evolution of botanical carnivory is the relationship between nutrients and photosynthesis; of primary interest is how carnivorous plants efficiently obtain scarce nutrients that are supplied primarily in organic form as prey, digest and mineralize them so that they can be readily used, and allocate them to immediate versus future needs. Most carnivorous plants are terrestrial – they are rooted in sandy or peaty wetland soils – and most studies of cost-benefit trade-offs in carnivorous plants are based on terrestrial carnivorous plants. However, approximately 10% of carnivorous plants are unrooted aquatic plants. In this Forum paper, we ask whether the cost-benefit model applies equally well to aquatic carnivorous plants and what general insights into trade-off models are gained by this comparison. Nutrient limitation is more pronounced in terrestrial carnivorous plants, which also have much lower growth rates and a much higher ratio of dark respiration to photosynthetic rates than aquatic carnivorous plants. Phylogenetic constraints on ecophysiological trade-offs among carnivorous plants remain unexplored. Despite differences in detail, the general cost-benefit framework continues to be of great utility in understanding the evolutionary ecology of carnivorous plants. We provide a research agenda that, if implemented, would further our understanding of ecophysiological trade-offs in carnivorous plants and also would provide broader insights into similarities and differences between aquatic and terrestrial plants of all types., Organismic and Evolutionary Biology, Other Research Unit
- Published
- 2011
39. Electromagnetically induced transparency-based slow and stored light in warm atoms
- Author
Novikova, Irina, Walsworth, Ronald L., and Xiao, Yanhong
- Subjects
Spin coherence, electromagnetically induced transparency, slow light, stored light, vapor cells, warm atoms
- Abstract
This paper reviews recent efforts to realize a high-efficiency memory for optical pulses using slow and stored light based on electromagnetically induced transparency (EIT) in ensembles of warm atoms in vapor cells. After a brief summary of basic continuous-wave and dynamic EIT properties, studies using weak classical signal pulses in optically dense coherent media are discussed, including optimization strategies for stored light efficiency and pulse-shape control, and modification of EIT and slow/stored light spectral properties due to atomic motion. Quantum memory demonstrations using both single photons and pulses of squeezed light are then reviewed. Finally, a brief comparison with other approaches is presented., Physics
- Published
- 2011
- Full Text
- View/download PDF
40. Information giving and receiving in hematological malignancy consultations
- Author
-
Alexander, Stewart C., Sullivan, Amy Marie, Back, Anthony L., Tulsky, James Aaron, Goldman, Roberta E., Block, Susan Dale, Stewart, Susan K., Wilson-Genderson, Maureen, and Lee, Stephanie J.
- Subjects
physician–patient encounters, communication, oncology, hematology, prognosis, cancer - Abstract
Purpose: Little is known about communication with patients suffering from hematologic malignancies, many of whom are seen by subspecialists in consultation at tertiary-care centers. These subspecialized consultations might provide the best examples of optimal physician–patient communication behaviors, given that they tend to be lengthy, to occur between individuals who have not met before and may have no intention of an ongoing relationship, and to have the goal of providing treatment recommendations. The aim of this paper is to describe and quantify the content of the subspecialty consultation with regard to exchanging information, and to identify patient and provider characteristics associated with discussion elements. Methods: Audio-recorded consultations between 236 patients and 40 hematologists were coded for recommended communication practices. Multilevel models for dichotomous outcomes were created to test associations between patient, physician, and consultation characteristics and key discussion elements. Results: Discussions about the purpose of the visit and the patient’s knowledge about their disease were common. Other elements, such as the patient’s preference for his or her role in decision-making, preferences for information, or understanding of presented information, were less common. Treatment recommendations were provided in 97% of the consultations, and unambiguous presentations of prognosis occurred in 81% of the consultations. Unambiguous presentations of prognosis were associated with non-White patient race, lower educational status, a greater number of questions asked, and the specific physician provider. Conclusion: Although some communication behaviors occur in most consultations, others are much less common and, if elicited, could help tailor the amount and type of information discussed. Approximately half of the patients are told unambiguous prognostic estimates for mortality or cure.
- Published
- 2011
- Full Text
- View/download PDF
41. Natural Goodness, Rightness, and the Intersubjectivity of Reason: A Reply to Arroyo, Cummiskey, Moland, and Bird-Pollan
- Author
-
Korsgaard, Christine M.
- Subjects
aggregation, Aristotle, Arroyo, Bird-Pollan, consequentialism, Cummiskey, Foot, Geach, Hegel, intersubjectivity, intrinsic value, Kant, Moland, natural goodness - Abstract
In response to Arroyo, I explain my position on the concept of “natural goodness” and how my use of that concept compares to that of Geach and Foot. An Aristotelian or functional notion of goodness provides the material for Kantian endorsement in a theory of value that avoids a metaphysical commitment to intrinsic values. In response to Cummiskey, I review reasons for thinking Kantianism and consequentialism incompatible, especially those objections to aggregation that arise from the notion of the natural good previously described. In response to Moland, I explain why I think Hegelian worries about the supposed emptiness of the Kantian self do not apply to my account. And in response to both Moland and Bird-Pollan, I argue that, contrary to the view of some Hegelians, the intersubjective normativity of reason is not something developed through actual social relations; rather, it is something essential to an individual's relations with himself or herself. I want to begin by thanking Christopher Arroyo, David Cummiskey, Lydia Moland, and Stefan Bird-Pollan for their interesting and provocative comments in this symposium. There's more in their papers than I can possibly respond to in a reasonable space, so I'm just going to pick and choose. “The Origin of the Good and Our Animal Nature” spells out some of my current thinking on the good, so a summary of that paper will put me in a position to begin by addressing some of Arroyo's and Cummiskey's points., Philosophy
- Published
- 2011
- Full Text
- View/download PDF
42. Opportunity Mars Rover mission: Overview and selected results from Purgatory ripple to traverses to Endeavour crater
- Author
-
Arvidson, R. E., Ashley, J. W., Bell, J. F., Chojnacki, M., Cohen, J., Economou, T. E., Farrand, W. H., Fergason, R., Fleischer, I., Geissler, P., Gellert, R., Golombek, M. P., Grotzinger, J. P., Guinness, E. A., Haberle, R. M., Herkenhoff, K. E., Herman, J. A., Iagnemma, K. D., Jolliff, B. L., Johnson, J. R., Klingelhöfer, G., Knoll, Andrew Herbert, Knudson, A. T., Li, R., McLennan, S. M., Mittlefehldt, D. W., Morris, R. V., Parker, T. J., Rice, M. S., Schröder, C., Soderblom, L. A., Squyres, S. W., Sullivan, R. J., and Wolff, M. J.
- Subjects
Mars Exploration Rover ,MER ,Opportunity - Abstract
Opportunity has been traversing the Meridiani plains since 25 January 2004 (sol 1), acquiring numerous observations of the atmosphere, soils, and rocks. This paper provides an overview of key discoveries between sols 511 and 2300, complementing earlier papers covering results from the initial phases of the mission. Key new results include (1) atmospheric argon measurements that demonstrate the importance of atmospheric transport to and from the winter carbon dioxide polar ice caps; (2) observations showing that aeolian ripples covering the plains were generated by easterly winds during an epoch with enhanced Hadley cell circulation; (3) the discovery and characterization of cobbles and boulders that include iron and stony-iron meteorites and Martian impact ejecta; (4) measurements of wall rock strata within Erebus and Victoria craters that provide compelling evidence of formation by aeolian sand deposition, with local reworking within ephemeral lakes; (5) determination that the stratigraphy exposed in the walls of Victoria and Endurance craters shows an enrichment of chlorine and a depletion of magnesium and sulfur with increasing depth. This result implies that regional-scale aqueous alteration took place before formation of these craters. Most recently, Opportunity has been traversing toward the ancient Endeavour crater. Orbital data show that clay minerals are exposed on its rim. Hydrated sulfate minerals are exposed in plains rocks adjacent to the rim, unlike the surfaces of plains outcrops observed thus far by Opportunity. With continued mechanical health, Opportunity will reach terrains on and around Endeavour's rim that will be markedly different from anything examined to date., Earth and Planetary Sciences
- Published
- 2011
- Full Text
- View/download PDF
43. Establishing criteria for higher-level classification using molecular data: the systematics of Polyommatus blue butterflies (Lepidoptera, Lycaenidae)
- Author
-
Talavera, Gerard, Lukhtanov, Vladimir A., Pierce, Naomi Ellen, and Vila, Roger
- Abstract
Most taxonomists agree on the need to adapt current classifications to recognize monophyletic units. However, delineations between higher taxonomic units can be based on the relative ages of different lineages and/or the level of morphological differentiation. In this paper, we address these issues in considering the species-rich Polyommatus section, a group of butterflies whose taxonomy has been highly controversial. We propose a taxonomy-friendly, flexible temporal scheme for higher-level classification. Using molecular data from nine markers (6666 bp) for 104 representatives of the Polyommatus section, representing all but two of the 81 described genera/subgenera, and five outgroups, we obtained a complete and well-resolved phylogeny for this clade. We use this to revise the systematics of the Polyommatus blues, and to define criteria that best accommodate the described genera within a phylogenetic framework. First, we normalize the concept of section (Polyommatus) and propose the use of subtribe (Polyommatina) instead. To preserve taxonomic stability and traditionally recognized taxa, we designate an age interval (4–5 Myr) instead of a fixed minimum age to define genera. The application of these criteria results in the retention of 31 of the 81 formally described generic names, and necessitates the description of one new genus (Rueckbeilia gen. nov.). We note that while classifications should be based on phylogenetic data, applying a rigid universal scheme is rarely feasible. Ideally, taxon age limits should be applied according to the particularities and pre-existing taxonomy of each group. We demonstrate that the concept of a morphological gap may be misleading at the genus level and can produce polyphyletic genera, and we propose that recognition of the existence of cryptic genera may be useful in taxonomy., Organismic and Evolutionary Biology
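The age-interval criterion suggests a simple algorithmic reading. The sketch below is hypothetical: the tree, the clade names, and the single 5-Myr cutoff are illustrative assumptions rather than the paper's data or method. It simply cuts a time-calibrated tree at the deepest clades whose crown age does not exceed a maximum genus age.

# Hypothetical sketch: delimit genera by cutting a time-calibrated tree at the
# deepest clades whose crown age does not exceed a maximum genus age (in Myr).
def delimit_genera(node, max_age=5.0):
    if node["age"] <= max_age or not node["children"]:
        return [(node["name"], node["age"])]
    genera = []
    for child in node["children"]:
        genera.extend(delimit_genera(child, max_age))
    return genera

# Toy tree: names and ages are invented for illustration only.
tree = {"name": "Polyommatina", "age": 12.0, "children": [
    {"name": "clade A", "age": 4.6, "children": []},
    {"name": "clade B", "age": 7.1, "children": [
        {"name": "clade B1", "age": 4.2, "children": []},
        {"name": "clade B2", "age": 3.0, "children": []}]}]}

print(delimit_genera(tree))  # [('clade A', 4.6), ('clade B1', 4.2), ('clade B2', 3.0)]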
- Published
- 2012
- Full Text
- View/download PDF
44. The growing role of web-based geospatial technology in disaster response and support
- Author
-
Kawasaki, Akiyuki, Berman, Merrick Lex, and Guan, Wendy
- Subjects
API (application programming interface), crowd-sourcing, geospatial portal, GIS (geographic information systems), Haiti earthquake, mash-ups, public involvement, Sichuan earthquake, web mapping - Abstract
This paper examines changes in disaster response and relief efforts, and recent web-based geospatial technological developments, through an evaluation of the experiences of the Center for Geographic Analysis, Harvard University, in the Sichuan (2008) and Haiti (2010) earthquake responses. The paper outlines how conventional GIS (geographic information systems) disaster responses by governmental agencies and relief organisations, and the means for geospatial data-sharing, have been transformed into a more dynamic, more transparent, and decentralised form with wide participation. It begins by briefly reviewing historical changes in the employment of geospatial technologies in major devastating disasters, including the Sichuan and Haiti earthquakes (the case studies for our geospatial portal project). It goes on to assess changes in the available dataset types and in geospatial disaster responders, as well as the impact of geospatial technological changes on disaster relief efforts. Finally, the paper discusses lessons learned from recent responses and offers some thoughts for future development., Other Research Unit
- Published
- 2012
- Full Text
- View/download PDF
45. The Property Clause Question
- Author
-
Michelman, Frank Isaac
- Abstract
A “property clause” is a dedicated text in the written basic law of a constitutional-democratic state, addressing the question of the security of asset-holdings (and of their values to their owners) against impairment by action or allowance of the state. The clause provides a defensive guarantee against such impairments, in the form of a trumping right of every person to be protected – perhaps not absolutely and unconditionally, but not negligibly, either – against state-engineered losses in lawfully established asset-holdings or asset-values. How should someone writing a constitution for an expectantly “social liberal” state regime think about the question of a property clause? Without suggesting that there can be any one-size-fits-all answer to the question of including such a clause or not, this paper confines itself to sharply doubting one sort of reason our constitution-writers might consider for including one – namely, that a liberal constitutional bill of rights ought to contain clauses covering all classes of interests of persons that qualify in liberalism as basic rights and freedoms, and the interest distinctively protected by a property clause does so qualify – and to suggesting some pros and cons of a quite different sort of reason for inclusion that the writers will also undoubtedly ponder – namely, that the clause will serve to keep lawmakers and constitutional adjudicators properly attuned to a national foundational commitment to a system of political economy in which markets play a key role. This essay, prepared as an after-dinner talk for the Conference on Constitutional Revolutions and Counter-Revolutions held at the New School for Social Research, May 5-7, 2011, is a companion to my “Liberal Constitutionalism, Property Rights, and the Assault on Poverty,” Stellenbosch Law Review (2012) (forthcoming), which treats more expansively some points made summarily here. A version of this essay will appear in Constellations 12 (2012).
- Published
- 2012
- Full Text
- View/download PDF
46. Analyzing Forensic Evidence Based on Density with Magnetic Levitation
- Author
-
Lockett, Matthew, Mirica, Katherine A., Mace, Charles R., Blackledge, Robert D., and Whitesides, George M.
- Subjects
forensic science ,density-based measurements ,analysis of contact traces ,magnetic levitation (MagLev) ,quantitative analytical method ,glitter ,gunpowder - Abstract
This paper describes a method for determining the density of contact trace objects with magnetic levitation (MagLev). MagLev measurements accurately determine the density (±0.0002 g/cm³) of a diamagnetic object and are compatible with objects that are nonuniform in shape and size. The MagLev device (composed of two permanent magnets with like poles facing) and the method described provide a means of accurately determining the density of trace objects. This method is inexpensive, rapid, and verifiable, and it provides numerical values—independent of the specific apparatus or analyst—that correspond to the absolute density of the sample and that may be entered into a searchable database. We discuss the feasibility of MagLev as a possible means of characterizing forensic-related evidence and demonstrate the ability of MagLev to (i) determine the density of samples of glitter and gunpowder, (ii) separate glitter particles of different densities, and (iii) determine the density of a glitter sample that was removed from a complex sample matrix., Chemistry and Chemical Biology
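For context, density maps linearly to levitation height through a working relation used widely in the MagLev literature (stated here as background with generic symbols; the paper's own calibration may differ). For a diamagnetic sample of density $\rho_s$ and magnetic susceptibility $\chi_s$ suspended in a paramagnetic medium ($\rho_m$, $\chi_m$) between two like-poles-facing magnets separated by a distance $d$ with surface field strength $B_0$,

$$ h = \frac{(\rho_s - \rho_m)\, g \mu_0 d^2}{(\chi_s - \chi_m)\, 4 B_0^2} + \frac{d}{2}, $$

so once the device is calibrated with density standards, reading the levitation height $h$ yields $\rho_s$ regardless of the object's shape or size.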
- Published
- 2013
- Full Text
- View/download PDF
47. Business Model Innovation and Competitive Imitation: The Case of Sponsor-Based Business Models
- Author
-
Casadesus-Masanell, Ramon and Zhu, Feng
- Subjects
business model ,innovation and invention ,price ,competitive strategy ,adoption ,value ,duopoly and oligopoly ,product ,customers ,market entry and exit ,monopoly - Abstract
This paper provides the first formal model of business model innovation. Our analysis focuses on sponsor-based business model innovations, in which a firm monetizes its product through sponsors rather than by setting prices for its customer base. We analyze strategic interactions between an innovative entrant and an incumbent, where the incumbent may imitate the entrant's business model innovation once it is revealed. The results suggest that an entrant needs to choose strategically whether to reveal its innovation by competing through the new business model or to conceal it by adopting a traditional business model. We also show that the value of business model innovation may be so substantial that an incumbent may prefer to compete in a duopoly rather than remain a monopolist.
- Published
- 2013
- Full Text
- View/download PDF
48. Field procedures in the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS)
- Author
-
Heeringa, Steven G., Gebler, Nancy, Colpe, Lisa J., Fullerton, Carol S., Hwang, Irving, Kessler, Ronald, Naifeh, James A., Nock, Matthew K., Sampson, Nancy A., Schoenbaum, Michael, Zaslavsky, Alan M., Stein, Murray B., and Ursano, Robert J.
- Subjects
Suicide ,mental disorders ,U.S. Army ,epidemiologic research design ,design effects ,sample bias ,sample weights ,survey design efficiency ,survey sampling - Abstract
The Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS) is a multi-component epidemiological and neurobiological study of unprecedented size and complexity, designed to generate actionable evidence-based recommendations to reduce U.S. Army suicides and to increase basic knowledge about the determinants of suicidality by carrying out coordinated component studies. A number of major logistical challenges were faced in implementing these studies. The current report presents an overview of the approaches taken to meet these challenges, with a special focus on the field procedures used to implement the component studies. As detailed in the paper, these challenges were addressed at the onset of the initiative by establishing an Executive Committee, a Data Coordination Center (the Survey Research Center [SRC] at the University of Michigan), and study-specific design and analysis teams that worked with staff on instrumentation and field procedures. SRC staff, in turn, worked with the Office of the Deputy Under Secretary of the Army (ODUSA) and local Army Points of Contact (POCs) to address logistical issues and facilitate data collection. These structures, coupled with careful fieldworker training, supervision, and piloting, contributed to the major Army STARRS data collection efforts having higher response rates than previous large-scale studies of comparable military samples., Psychology
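Because the abstract flags sample weights and design effects as key design considerations, a minimal sketch may help; it uses Kish's classical approximation with invented toy weights, and is not Army STARRS code.

import numpy as np

# Kish's approximation: deff = 1 + CV(w)^2, where CV(w) is the coefficient of
# variation of the survey weights; the effective sample size is n / deff.
def kish_design_effect(weights):
    w = np.asarray(weights, dtype=float)
    return 1.0 + w.var() / w.mean() ** 2

weights = np.random.default_rng(0).lognormal(sigma=0.5, size=1000)  # toy weights
deff = kish_design_effect(weights)
print(f"deff = {deff:.2f}, effective n = {len(weights) / deff:.0f}")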
- Published
- 2013
- Full Text
- View/download PDF
49. Magnetic Assembly of Soft Robots with Hard Components
- Author
-
Kwok, Sen W., Morin, Stephen A., Mosadegh, Bobak, So, Ju-Hee, Shepherd, Robert F., Martinez, Ramses, Smith, Barbara L., Simeone, Felice, Stokes, Adam A., and Whitesides, George McClelland
- Subjects
soft robots ,soft machines ,magnetic assembly ,hybrid ,reconfigurable - Abstract
This paper describes the modular magnetic assembly of reconfigurable, pneumatically actuated robots composed of soft and hard components and materials. The soft components of these hybrid robots are actuators fabricated from silicone elastomers using soft lithography, and the hard components are acrylonitrile-butadiene-styrene (ABS) structures made using three-dimensional (3D) printing. Neodymium-iron-boron (NdFeB) ring magnets are embedded in these components to make and maintain the connections between components. The reversibility of these magnetic connections allows the rapid reconfiguration of these robots using components made of different materials (soft and hard) that also have different sizes, structures, and functions; in addition, it accelerates the testing of new designs, the exploration of new capabilities, and the repair or replacement of damaged parts. This method of assembling soft actuators to build soft machines addresses some limitations associated with using soft lithography for the direct molding of complex 3D pneumatic networks. Combining the self-aligning property of magnets with pneumatic control makes it possible for a teleoperator to modify the structures and capabilities of these robots readily in response to the requirements of different tasks., Physics
- Published
- 2013
50. Next-Generation QTL Mapping: Crowdsourcing SNPs, Without Pedigrees
- Author
-
Edwards, Scott V.
- Subjects
life history traits ,natural population ,pedigree ,QTL mapping ,quantitative traits - Abstract
For many molecular ecologists, the mantra and mission of the field of ecological genomics could be encapsulated by the phrase ‘to find the genes that matter’ (Mitchell-Olds 2001; Rockman 2012). This phrase of course refers to the early hope and current increasing success in the search for genes whose variation underlies phenotypic variation and fitness in natural populations. In the years since the modern incarnation of the field of ecological genomics, many would agree that the low-hanging fruit has, at least in principle, been plucked: we now have several elegant examples of genes whose variation influences key adaptive traits in natural populations, and these examples have revealed important insights into the architecture of adaptive variation (Hoekstra et al. 2006; Shapiro et al. 2009; Chan et al. 2010). But how well will these early examples, often involving single genes of large effect on discrete or near-discrete phenotypes, represent the dynamics of adaptive change for the totality of phenotypes in nature? Will traits exhibiting continuous rather than discrete variation in natural populations have as simple a genetic basis as these early examples suggest (Prasad et al. 2012; Rockman 2012)? Two papers in this issue (Robinson et al. 2013; Santure et al. 2013) not only suggest answers to these questions but also provide useful extensions of statistical approaches for ecological geneticists to study the genetics of continuous variation in nature. Together these papers, by the same research groups studying evolution in a natural population of Great Tits (Parus major), provide a glimpse of what we should expect as the field begins to dissect the genetic basis of what is arguably the most common type of variation in nature, and how genome-wide surveys of variation can be applied to natural populations without pedigrees., Organismic and Evolutionary Biology
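The pedigree-free step these papers exemplify can be made concrete: relatedness is estimated directly from genome-wide SNPs rather than from known parentage. Below is a minimal sketch using VanRaden's genomic relationship matrix, one standard construction, with invented toy genotypes; the studies discussed may use different estimators.

import numpy as np

# VanRaden's genomic relationship matrix (GRM): genotypes is an
# (individuals x SNPs) matrix of 0/1/2 allele counts.
def genomic_relationship_matrix(genotypes):
    X = np.asarray(genotypes, dtype=float)
    p = X.mean(axis=0) / 2.0            # per-SNP allele frequencies
    Z = X - 2.0 * p                     # center by expected allele count
    return Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))

rng = np.random.default_rng(1)
G = genomic_relationship_matrix(rng.integers(0, 3, size=(20, 500)))  # toy data
print(G.shape)  # (20, 20); off-diagonal entries estimate pairwise relatedness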
- Published
- 2013
- Full Text
- View/download PDF