100 results for "Curtis, Andrew"
Search Results
2. A Single Motor Nano Aerial Vehicle with Novel Peer-to-Peer Communication and Sensing Mechanism
- Author
-
Wang, Jingxian, Curtis, Andrew G., Yim, Mark, and Rubenstein, Michael
- Abstract
Communication and position sensing are among the most important capabilities for swarm robots to interact with their peers and perform tasks collaboratively. However, the hardware required to facilitate communication and position sensing is often too complicated, expensive, and bulky to be carried on swarm robots. Here we present Maneuverable Piccolissimo 3 (MP3), a minimalist, single motor drone capable of executing inter-robot communication via infrared light and triangulation-based sensing of relative bearing, distance, and elevation using message arrival time. Thanks to its novel design, MP3 can communicate with peers and localize itself using simple components, keeping its size and mass small and making it inherently safe for human interaction. We present the hardware and software design of MP3 and demonstrate its capability to localize itself, fly stably, and maneuver in the environment using peer-to-peer communication and sensing.
- Published
- 2024
3. Continuous Sculpting: Persistent Swarm Shape Formation Adaptable to Local Environmental Changes
- Author
-
Curtis, Andrew G., Yim, Mark, and Rubenstein, Michael
- Abstract
Despite their growing popularity, swarms of robots remain limited by the operating time of each individual. We present algorithms which allow a human to sculpt a swarm of robots into a shape that persists in space perpetually, independent of onboard energy constraints such as batteries. Robots generate a path through the shape such that they cycle in and out of it. Robots inside the shape react to human-initiated changes and adapt the path through the shape accordingly. Robots outside the shape recharge and return to the shape so that the shape can persist indefinitely. The presented algorithms communicate shape changes throughout the swarm using message passing and robot motion. These algorithms enable the swarm to persist through any arbitrary changes to the shape. We describe these algorithms in detail and present their performance in simulation and on a swarm of mobile robots. The result is a swarm behavior more suitable for extended duration, dynamic shape-based tasks in applications such as agriculture and emergency response.
- Published
- 2024
4. Two families of holomorphic correspondences
- Author
-
Curtis, Andrew
- Subjects
515, Holomorphic correspondences, covering correspondence, Cantor set correspondences, Klein Combination Theorem, matings
- Abstract
Holomorphic correspondences are multivalued functions from the Riemann sphere to itself. This thesis is concerned with a certain type of holomorphic correspondence known as a covering correspondence. In particular, we are concerned with a one-complex-dimensional family of correspondences constructed by post-composing a covering correspondence with a conformal involution. Correspondences constructed in this manner have varied and intricate dynamics. We introduce and analyse two subfamilies of this parameter space. The first family consists of correspondences for which the limit set is a Cantor set; the second consists of correspondences for which the limit set is connected and for which the action of the correspondence on the complement of this limit set exhibits certain group-like behaviour.
- Published
- 2014
5. The influence of ‘nanocluster’ reinforcement on the mechanical properties of a resin-based composite material
- Author
-
Curtis, Andrew R.
- Subjects
617.6, RK Dentistry
- Abstract
The introduction of innovative filled methacrylate resin composites has revolutionised the field of aesthetic restorative dentistry and provided a clinically viable alternative to amalgam-based restorations. However, the mechano-physical properties and resultant clinical longevity of these materials were insufficient. To improve these properties, the on-going development of resin-based composites (RBCs) has sought to modify the filler size and morphology and to improve the loading and distribution of constituent filler particles. This has resulted in the introduction of so-called 'nanofills', which possess a combination of nano- and micro-sized filler to produce a hybrid material. A variation on this approach was the introduction of 'nanocluster' particles, which are essentially an agglomeration of nano-sized silica and zirconia particles. Although these materials have demonstrated a degree of clinical and experimental success, debate remains as to their specific benefit compared with existing conventionally filled systems. Following placement, RBC restorations are exposed to masticatory loading (repeated sub-critical stresses), which is typically detrimental to the clinical longevity of the material. The current study determined that RBCs reinforced with 'nanocluster' particles possessed statistically similar or significantly increased bi-axial flexure strengths and associated Weibull moduli following pre-loading regimes which produced catastrophic failure of conventionally filled RBCs. This was attributed to the unique reinforcement provided by the 'nanocluster' particles, which a novel micromanipulation technique identified as possessing distinctive fracture mechanisms in addition to an IPC-like structure. These acted in combination to absorb and dissipate loading stresses and to provide enhanced damage tolerance.
Near-infra-red spectroscopy was also employed to determine the water sorption; it did not identify any direct correlation between water content and the extent of strength reduction. However, immersion of the materials in water, and also in sodium hydroxide or ethanol, highlighted that the long-term hydrolytic stability of the 'nanoclusters' was limited. This suggested that degradation of the interfacial silane layer weakened the 'nanocluster' particles, causing them to act as defect centres within the resin matrix and consequently to generate a greater loss of strength. Therefore, whilst the 'nanocluster'-reinforced RBCs have the potential to provide enhanced damage tolerance and improved clinical longevity, the limited long-term hydrolytic stability suggests that further development of hydrophilic silane coupling agents and resin monomers is required to realise these properties.
- Published
- 2009
6. Understanding social capital : are the problems inherent in Putnam's concept intractable?
- Author
-
Curtis, Andrew
- Subjects
306.01
- Abstract
Putnam's version of social capital, and the main problems with it, are outlined in Chapter One of this thesis. In Chapters Two and Three alternative conceptual approaches are examined to see whether they might resolve any of the difficulties in Putnam's work. The six problems arising from Putnam's work identified in Chapter One are: 1) the lack of a developed conceptual framework; 2) whether macro-level analysis is appropriate; 3) how the concept fits with considerations of structure and agency; 4) whether the negative aspects of social capital are fully taken into account; 5) what the relative merits of "bonding" and "bridging" social capital are; and 6) whether social capital is only ever a by-product of other activities or can also be consciously created. In Chapter Two, Coleman and Ostrom's separate work on social capital is analysed. They use the concept as part of an attempt to add broader social considerations to theories of rational and collective action. In Chapter Three, the main authors examined are Bourdieu and Lin. Bourdieu uses social capital to complement his concept of cultural capital in looking at the reproduction of inequality. Lin develops a theory of social capital that focuses on individuals' action in pursuing resources in networks. It emerges that the other authors can contribute various elements that help to address some of the problems in Putnam's work. Yet the most appropriate level of analysis and the full implications of bridging social capital remain points of contention. In Chapter Four the future of Putnam's use of social capital is debated, and it is concluded that he will have to abandon his macro-level analysis if the full conceptual intricacies of social capital are to be realised.
- Published
- 2007
7. Re-reading the Gospel of Luke today : from a first century urban writing site to a twentieth century urban reading site
- Author
-
Curtis, Andrew John
- Subjects
800, Biblical interpretation
- Abstract
Postmodern theorising has presented the reader as an active agent in the process of the interpretation of texts. Sociology-of-knowledge approaches have identified both the author and the reader of texts as socially embodied within a context. This study presents a unique collection of readings in the Gospel of Luke by ordinary real-readers from a disadvantaged and/or marginalised social and ecclesial location, within an affluent first-world context. These readings, transcribed in Volume Two, present empirical reader research for analysis, through dialogue and conversation with professional readings in the Gospel of Luke, in order to assess what contribution the former might make to contemporary hermeneutics. Identifying contemporary human experience of ordinary real-readers as the starting point in their reading of the Lukan text, the study illustrates how these readings act as a useful tool of suspicion in conversation with readings that claim to be objective and value-neutral, and how they facilitate critical reflection on the ideological and theological commitments of the dominant classes in society and church. The value and legitimacy of the readings of ordinary real-readers is discussed, along with how their social and ecclesial marginalisation and disadvantage provides a non-totalising presence in biblical interpretation, a presence that guards against the claims of permanence made by those in the academic and ecclesial world. Identification of contemporary human experience as inevitably influencing the process of interpretation leads to a consideration of the place of the historical-critical paradigm in biblical studies. The value and legitimacy of ordinary real-readers as active agents in the process of interpretation, and the contribution they make to contemporary hermeneutics, require a consideration of safeguards against reading anarchy.
The process of self and social analysis, and an openness to dialogue and conversation with those outside our own contexts, including our ancestors in the faith, is considered as a way forward, utilising ordinary and professional real-readers in the ongoing process of biblical interpretation.
- Published
- 1999
- Full Text
- View/download PDF
8. A Framework for Assessing Energy Exporting Countries' Vulnerability and Energy Security: Current Fossil Fuel-Dependent Economy and Future Hydrogen Economy
- Author
-
Curtis, Andrew John Bathgate
- Published
- 2023
9. Framework for Assessment of the Economic Vulnerability of Energy-Resource-Exporting Countries
- Author
-
Curtis, Andrew and McLellan, Benjamin
- Abstract
Energy security is widely examined from the perspective of energy import vulnerability, but it is less common to evaluate the vulnerability of energy exporters. This paper presents an assessment framework and quantitative scorecard for evaluating the economic vulnerability of countries with significant energy exports. The background research of various related conceptual frameworks distils useful insights from energy security, corporate risks, and general economic vulnerability. Carbon exposure, largely missing from related work, is introduced to the study in new factors to evaluate exporter vulnerability to increasing global action on climate change. A holistic view is taken of all energy resource exports as a novel approach, rather than focusing on individual fuels. The developed scorecard is used to provide case studies of five major global energy exporters with comparative analysis between countries and over time.
- Published
- 2023
11. Language, learning and support : overseas students at a British university
- Author
-
Curtis, Andrew
- Subjects
370, Education & training
- Published
- 1995
12. Shear wave studies and elastic models of extensional zones : the Tibetan Plateau and Aegean region
- Author
-
Curtis, Andrew
- Subjects
551.21, Volcanology & plate tectonics
- Published
- 1994
13. Modular interface for managing cognitive bias in experts
- Author
-
Whitehead, Melody G and Curtis, Andrew
- Abstract
Expert knowledge is required to interpret data across a range of fields. Experts bridge gaps that often exist in our knowledge about relationships between data and the parameters of interest. This is especially true in geoscientific applications, where knowledge of the Earth is derived from interpretations of observable features and relies on predominantly unproven but widely accepted theories. Thus, experts facilitate solutions to otherwise unsolvable problems. However, experts are inherently subjective and susceptible to cognitive biases and adverse external effects. This work examines this problem within geoscience. Three compelling examples of the prevalence of cognitive biases are provided from previous work. The problem is then formally defined, and a set of design principles is presented which ensures that any solution is sufficiently flexible to be readily applied to the range of geoscientific problems. No solutions exist that reliably capture and reduce cognitive bias in experts. However, formal expert elicitation methods can be used to assess expert variation, and a variety of approaches exist that may help to illuminate uncertainties, avoid misunderstandings, and reduce herding behaviours or single-expert over-dominance. This work combines existing and future approaches to reduce expert suboptimality through a flexible modular design in which each module provides a specific function. The design centres around action modules that force a stop-and-perform step into interpretation tasks. A starter pack of modules is provided as an example of the conceptual design. This simple bias-reduction system may readily be applied in organisations and during everyday interpretations through to tasks for major commercial ventures.
- Published
- 2022
14. Accounting for natural uncertainty within monitoring systems for induced seismicity based on earthquake magnitudes
- Author
-
Roy, Corinna, Nowacki, Andy, Zhang, Xin, Curtis, Andrew, and Baptie, Brian
- Abstract
To reduce the probability of future large earthquakes, traffic light systems (TLSs) define appropriate reactions to observed induced seismicity depending on each event's range of local earthquake magnitude (ML). The impact of velocity uncertainties and station site effects may be greater than a whole magnitude unit of ML, which can make the difference between a decision to continue (“green” TLS zone) and an immediate stop of operations (“red” zone). We show how to include these uncertainties in thresholds such that events only exceed a threshold with a fixed probability. This probability can be set by regulators to reflect their tolerance to risk. We demonstrate that with the new TLS, a red-light threshold would have been encountered earlier in the hydraulic fracturing operation at Preston New Road, UK, halting operations and potentially avoiding the later large magnitude events. It is therefore critical to establish systems which permit regulators to account for uncertainties when managing risk.
- Published
- 2021
15. Principles for Evaluation of AI/ML Model Performance and Robustness
- Author
-
Brown, Olivia, Curtis, Andrew, and Goodwin, Justin
- Abstract
The Department of Defense (DoD) has significantly increased its investment in the design, evaluation, and deployment of Artificial Intelligence and Machine Learning (AI/ML) capabilities to address national security needs. While there are numerous AI/ML successes in the academic and commercial sectors, many of these systems have also been shown to be brittle and nonrobust. In a complex and ever-changing national security environment, it is vital that the DoD establish a sound and methodical process to evaluate the performance and robustness of AI/ML models before these new capabilities are deployed to the field. This paper reviews the AI/ML development process, highlights common best practices for AI/ML model evaluation, and makes recommendations to DoD evaluators to ensure the deployment of robust AI/ML capabilities for national security needs.
- Published
- 2021
16. Imaging the subsurface using induced seismicity and ambient noise: 3-D tomographic Monte Carlo joint inversion of earthquake body wave traveltimes and surface wave dispersion
- Author
-
Zhang, Xin, Roy, Corinna, Curtis, Andrew, Nowacki, Andy, and Baptie, Brian
- Abstract
Seismic body wave traveltime tomography and surface wave dispersion tomography have been used widely to characterize earthquakes and to study the subsurface structure of the Earth. Since these types of problem are often significantly non-linear and have non-unique solutions, Markov chain Monte Carlo methods have been used to find probabilistic solutions. Body and surface wave data are usually inverted separately to produce independent velocity models. However, body wave tomography is generally sensitive to structure around the subvolume in which earthquakes occur and produces limited resolution in the shallower Earth, whereas surface wave tomography is often sensitive to shallower structure. To better estimate subsurface properties, we therefore jointly invert for the seismic velocity structure and earthquake locations using body and surface wave data simultaneously. We apply the new joint inversion method to a mining site in the United Kingdom at which induced seismicity occurred and was recorded on a small local network of stations, and where ambient noise recordings are available from the same stations. The ambient noise is processed to obtain inter-receiver surface wave dispersion measurements which are inverted jointly with body wave arrival times from local earthquakes. The results show that by using both types of data, the earthquake source parameters and the velocity structure can be better constrained than in independent inversions. To further understand and interpret the results, we conduct synthetic tests to compare the results from body wave inversion and joint inversion. The results show that trade-offs between source parameters and velocities appear to bias results if only body wave data are used, but this issue is largely resolved by using the joint inversion method. Thus the use of ambient seismic noise and our fully non-linear inversion provides a valuable, improved method to image the subsurface velocity and seismicity.
- Published
- 2020
17. Identifying multiply scattered wavepaths in strongly scattering and dispersive media
- Author
-
Masfara, L.O.M., Curtis, Andrew, Thomsen, Henrik Rasmus, and Van Manen, Dirk Jan
- Abstract
The ability to extract information from scattered waves is usually limited to singly scattered energy even if multiple scattering might occur in the medium. As a result, the information in arrival times of higher-order scattered events is underexplored. This information is extracted using fingerprinting theory. This theory has never previously been applied successfully to real measurements, particularly when the medium is dispersive. The theory is used to estimate the arrival times and scattering paths of multiply scattered waves in a thin sheet using an automated scheme in a dispersive medium by applying an additional dispersion compensation method. Estimated times and paths are compared with predictions based on a sequence of straight ray paths for each scattering event given the known scatterer locations. Additionally, numerical modelling is performed to verify the interpretations of the compensated data. Since the source also acts as a scatterer in these experiments, initially, the predictions and the numerical results did not conform to the experimental observations. By reformulating the theory and the processing scheme and adding a source scatterer in the modelling, it is shown that predictions of all observed scattering events are possible with both prediction methods, verifying that the methods are both effective and practically achievable.
- Published
- 2020
- Full Text
- View/download PDF
18. Aspects of the chemistry of some colloidal metals
- Author
-
Curtis, Andrew Crawford
- Subjects
541, Physical chemistry
- Published
- 1988
19. Automated LULC Map Production using Deep Neural Networks
- Author
-
Henry, Christopher J., Storie, Christopher, Palaniappan, Muthu, Alhassan, Victor, Swamy, Mallikarjun, Aleshinloye, Damilola, Curtis, Andrew, and Kima, Daeyoun
- Abstract
This article presents an approach to automating the creation of land-use/land-cover classification (LULC) maps from satellite images using deep neural networks that were developed to perform semantic segmentation of natural images. This work is important since the production of accurate and timely LULC maps is becoming essential to government and private companies that rely on them for large-scale monitoring of land resource changes. In this work, deep neural networks are trained to classify each pixel of a satellite image into one of a number of LULC classes. The presented deep neural networks are all pre-trained using the ImageNet Large-Scale Visual Recognition Competition (ILSVRC) datasets and then fine-tuned using approximately 19,000 Landsat 5/7 satellite images of resolution 224×224 taken of the Province of Manitoba in Canada. The result is an automated solution that can produce LULC maps significantly faster than current semi-automated methods. The contributions of this article are the observation that deep neural networks developed for semantic segmentation can be used to automate the task of producing LULC maps; the use of these networks to produce LULC maps; and a comparison of several popular semantic segmentation architectures for solving the problem of automated LULC map production.
- Published
- 2019
20. The future of passive seismic acquisition
- Author
-
Hammond, James O.S., England, Richard, Rawlinson, Nick, Curtis, Andrew, Sigloch, Karin, Harmon, Nick, and Baptie, Brian
- Abstract
James Hammond and co-authors report from a BGA meeting on how advances in instrumentation are opening up opportunities for dense, large-scale deployments of seismometers on land and in the oceans.
- Published
- 2019
21. Spatial video geonarratives and health: case studies in post-disaster recovery, crime, mosquito control and tuberculosis in the homeless.
- Author
-
Curtis, Andrew, Curtis, Jacqueline W, Shook, Eric, Smith, Steve, Jefferis, Eric, Porter, Lauren, Schuch, Laura, Felix, Chaz, and Kerndt, Peter R
- Abstract
Background: A call has recently been made by the public health and medical communities to understand the neighborhood context of a patient's life in order to improve education and treatment. To do this, methods are required that can collect "contextual" characteristics while complementing the spatial analysis of more traditional data. This also needs to happen within a standardized, transferable, easy-to-implement framework.
Methods: The Spatial Video Geonarrative (SVG) is an environmentally-cued narrative where place is used to stimulate discussion about fine-scale geographic characteristics of an area and the context of their occurrence. It is a simple yet powerful approach to enable collection and spatial analysis of expert and resident health-related perceptions and experiences of places. Participants comment about where they live or work while guiding a driver through the area. Four GPS-enabled cameras are attached to the vehicle to capture the places that are observed and discussed by the participant. Audio recording of this narrative is linked to the video via time stamp. A program (G-Code) is then used to geotag each word as a point in a geographic information system (GIS). Querying and density analysis can then be performed on the narrative text to identify spatial patterns within one narrative or across multiple narratives. This approach is illustrated using case studies on post-disaster psychopathology, crime, mosquito control, and TB in homeless populations.
Results: SVG can be used to map individual, group, or contested group context for an environment. The method can also gather data for cohorts where traditional spatial data are absent. In addition, SVG provides a means to spatially capture, map and archive institutional knowledge.
Conclusions: SVG GIS output can be used to advance theory by being used as input into qualitative and/or spatial analyses. SVG can also be used to gain near-real-time insight, therefore supporting applied interventions. Advances over e
- Published
- 2015
22. Locating microseismic sources with a single seismometer channel using coda wave interferometry
- Author
-
Zhao, Youqian, Curtis, Andrew, and Baptie, Brian
- Abstract
A novel source location method based on coda wave interferometry (CWI) was applied to a microseismic data set of mining-induced events recorded in Nottinghamshire, England. CWI uses scattered waves in the coda of seismograms to estimate the differences between two seismic states. We used CWI to estimate the distances between pairs of earthquake locations, which were then used jointly to determine the relative location of a cluster of events using a probabilistic framework. We evaluated two improvements to this location technique: these account for the impact of a large difference in the dominant wavelength of recordings made on different instruments, and they standardize the selection of parameters used when implementing the method. Although the method has been shown to produce reasonable estimates for larger earthquakes, we tested it on microseismic events with shorter distinguishable codas in recorded waveforms, and hence fewer recorded scattered waves. The earthquake location results are highly consistent when using different individual seismometer channels, showing that it is possible to locate event clusters with a single-channel seismometer. We thus extend the potential applications of this cost-effective method to seismic events over a wider range of magnitudes.
- Published
- 2017
23. Transdimensional Love-wave tomography of the British Isles and shear-velocity structure of the East Irish Sea Basin from ambient-noise interferometry
- Author
-
Galetti, Erica, Curtis, Andrew, Baptie, Brian, Jenkins, David, and Nicolson, Heather
- Abstract
We present the first Love-wave group velocity and shear velocity maps of the British Isles obtained from ambient noise interferometry and fully non-linear inversion. We computed interferometric inter-station Green's functions by cross-correlating the transverse component of ambient noise records retrieved by 61 seismic stations across the UK and Ireland. Group velocity measurements along each possible inter-station path were obtained using frequency-time analysis and converted into a series of inter-station traveltime datasets between 4 and 15 seconds period. Traveltime uncertainties, estimated from the standard deviation of dispersion curves constructed by stacking randomly-selected subsets of daily cross-correlations, were observed to be too low to allow reasonable data fits to be obtained during tomography. Data uncertainties were therefore estimated again during the inversion as distance-dependent functionals. We produced Love-wave group velocity maps within 8 different period bands using a fully non-linear tomography method which combines the transdimensional reversible-jump Markov chain Monte Carlo (rj-McMC) algorithm with an eikonal raytracer. By modelling exact raypaths at each step of the Markov chain we ensured that the non-linear character of the inverse problem was fully and correctly accounted for. Between 4 and 10 seconds period, the group velocity maps show remarkable agreement with the known geology of the British Isles and correctly identify a number of low-velocity sedimentary basins and high-velocity features. Longer period maps, in which most sedimentary basins are not visible, are instead mainly representative of basement rocks. In a second stage of our study we used the results of tomography to produce a series of Love-wave group velocity dispersion curves across a grid of geographical points focussed around the East Irish Sea sedimentary basin. We then independently inverted each curve using a similar rj-McMC algorithm to obtain a series of one
- Published
- 2016
24. Uncertainty loops in travel-time tomography from nonlinear wave physics
- Author
-
Galetti, Erica, Curtis, Andrew, Meles, Giovanni Angelo, and Baptie, Brian
- Abstract
Estimating image uncertainty is fundamental to guiding the interpretation of geoscientific tomographic maps. We reveal novel uncertainty topologies (loops) which indicate that while the speeds of both low- and high-velocity anomalies may be well constrained, their locations tend to remain uncertain. The effect is widespread: loops dominate around a third of United Kingdom Love wave tomographic uncertainties, changing the nature of interpretation of the observed anomalies. Loops exist due to 2nd and higher order aspects of wave physics; hence, although such structures must exist in many tomographic studies in the physical sciences and medicine, they are unobservable using standard linearized methods. Higher order methods might fruitfully be adopted.
- Published
- 2015
25. Constructing new seismograms from old earthquakes: retrospective seismology at multiple length scales
- Author
-
Entwistle, Elizabeth, Curtis, Andrew, Galetti, Erica, Baptie, Brian, Meles, Giovanni, Entwistle, Elizabeth, Curtis, Andrew, Galetti, Erica, Baptie, Brian, and Meles, Giovanni
- Abstract
If energy emitted by a seismic source such as an earthquake is recorded on a suitable backbone array of seismometers, source-receiver interferometry (SRI) is a method that allows those recordings to be projected to the location of another target seismometer, providing an estimate of the seismogram that would have been recorded at that location. Since the other seismometer may not have been deployed at the time at which the source occurred, this renders possible the concept of “retrospective seismology” whereby the installation of a sensor at one period of time allows the construction of virtual seismograms as though that sensor had been active before or after its period of installation. Here we construct such virtual seismograms on target sensors in both industrial seismic and earthquake seismology settings, using both active seismic sources and ambient seismic noise to construct SRI propagators, and on length scales ranging over 5 orders of magnitude from ∼40 m to ∼2500 km. In each case we compare seismograms constructed at target sensors by SRI to those actually recorded on the same sensors. We show that spatial integrations required by interferometric theory can be calculated over irregular receiver arrays by embedding these arrays within 2-D spatial Voronoi cells, thus improving spatial interpolation and interferometric results. The results of SRI are significantly improved by restricting the backbone receiver array to include approximately those receivers that provide a stationary-phase contribution to the interferometric integrals. Finally, we apply both correlation-correlation and correlation-convolution SRI and show that the latter constructs fewer nonphysical arrivals.
- Published
- 2015
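The Voronoi-cell device mentioned in the abstract above, weighting each receiver of an irregular array by the area of its surrounding cell before spatially integrating, can be illustrated with a toy quadrature. The receiver layout and integrand are invented, and the cell areas are estimated by nearest-neighbour assignment on a fine grid rather than by an exact Voronoi construction:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)

# Irregularly placed "receivers" in the unit square, each with a sampled
# integrand value (a stand-in for an interferometric integrand term).
receivers = rng.random((40, 2))
f = lambda xy: np.sin(2 * np.pi * xy[:, 0]) ** 2 + xy[:, 1]
samples = f(receivers)

# Voronoi weights: the area of each receiver's nearest-neighbour (Voronoi)
# cell, estimated by assigning a fine grid of points to its closest receiver.
g = np.linspace(0, 1, 200)
grid = np.array(np.meshgrid(g, g)).reshape(2, -1).T
_, owner = cKDTree(receivers).query(grid)
weights = np.bincount(owner, minlength=len(receivers)) / len(grid)

voronoi_estimate = np.sum(weights * samples)   # area-weighted quadrature
naive_estimate = samples.mean()                # equal weights, ignores geometry
exact = 1.0  # integral of sin^2(2*pi*x) + y over the unit square
print(voronoi_estimate, naive_estimate, exact)
```

The area weighting down-weights clustered receivers that an equal-weight sum would over-count, which is the spatial interpolation problem the Voronoi embedding in the paper addresses.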
26. Household-Level Spatiotemporal Patterns of Incidence of Cholera, Haiti, 2011
- Author
-
Blackburn, Jason K., Diamond, Ulrica, Kracalik, Ian T., Widmer, Jocelyn M., Brown, Will, Morrissey, B. David, Alexander, Kathleen A., Curtis, Andrew J., Ali, Afsar, Morris, J. Glenn Jr., Blackburn, Jason K., Diamond, Ulrica, Kracalik, Ian T., Widmer, Jocelyn M., Brown, Will, Morrissey, B. David, Alexander, Kathleen A., Curtis, Andrew J., Ali, Afsar, and Morris, J. Glenn Jr.
- Abstract
A cholera outbreak began in Haiti during October, 2010. Spatiotemporal patterns of household-level cholera in Ouest Department showed that the initial clusters tended to follow major roadways; subsequent clusters occurred further inland. Our data highlight transmission pathway complexities and the need for case and household-level analysis to understand disease spread and optimize interventions.
- Published
- 2014
- Full Text
- View/download PDF
27. A nonlinearly compliant transmission element for force sensing and control
- Author
-
J. Kenneth Salisbury, Jr., Massachusetts Institute of Technology. Department of Mechanical Engineering., Curtis, Andrew W., 1970, J. Kenneth Salisbury, Jr., Massachusetts Institute of Technology. Department of Mechanical Engineering., and Curtis, Andrew W., 1970
- Abstract
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 2000., Includes bibliographical references (p. 55-56)., by Andrew W. Curtis., S.M.
- Published
- 2014
28. Rayleigh wave tomography of the British Isles from ambient seismic noise
- Author
-
Nicolson, Heather, Curtis, Andrew, Baptie, Brian, Nicolson, Heather, Curtis, Andrew, and Baptie, Brian
- Abstract
We present the first Rayleigh wave group speed maps of the British Isles constructed from ambient seismic noise. The maps also constitute the first surface wave tomography study of the crust under the British Isles at a relatively high resolution. We computed interferometric, interstation Rayleigh waves from vertical component records of ambient seismic noise recorded on 63 broad-band and short-period stations across the UK and Ireland. Group velocity measurements were made from the resulting surface wave dispersion curves between 5 and 25 s using a multiple phase-matched filter method. Uncertainties in the group velocities were computed by calculating the standard deviation of four dispersion curves constructed by stacking a random selection of daily cross-correlations. Where an uncertainty could not be obtained for a ray path using this method, we estimated it as a function of the interreceiver distance. Group velocity maps were computed for 5–25-s period using the Fast Marching forward solution of the eikonal equation and iterative, linearized inversion. At short and intermediate periods, the maps show remarkable agreement with the major geological features of the British Isles including: terrane boundaries in Scotland; regions of late Palaeozoic basement uplift; areas of exposed late Proterozoic/early Palaeozoic rocks in southwest Scotland, northern England and northwest Wales; and sedimentary basins formed during the Mesozoic such as the Irish Sea Basin, the Chester Basin, the Worcester Graben and the Wessex Basin. The maps also show a consistent low-velocity anomaly in the region of the Midlands Platform, a Proterozoic crustal block in the English Midlands. At longer periods, which are sensitive to velocities in the lower crust/upper mantle, the maps suggest that the depth of the Moho beneath the British Isles decreases towards the north and west. Areas of fast velocity in the lower crust also coincide with areas thought to be associated with underplating of the lo
- Published
- 2014
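Group velocity measurement by a phase-matched or frequency-time method, as used in the abstract above, amounts to narrowband filtering around each period and picking the envelope maximum as the group arrival time. A minimal sketch on a synthetic two-packet record (the periods, group velocities, distance and filter width are all invented):

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic surface-wave record: two narrowband wave packets arriving at the
# group traveltimes implied by the (invented) group velocities below.
dt, n, distance_km = 0.05, 4000, 300.0
t = np.arange(n) * dt
packets = {5.0: 3.0, 10.0: 2.5}   # period (s) -> group velocity (km/s), assumed
trace = np.zeros(n)
for period, u in packets.items():
    t_arr = distance_km / u
    trace += np.cos(2 * np.pi / period * (t - t_arr)) * np.exp(-((t - t_arr) / 15) ** 2)

def group_velocity(trace, period, dt, distance_km, width=0.03):
    """Frequency-time analysis: Gaussian narrowband filter centred on 1/period,
    then pick the envelope maximum as the group arrival time."""
    freqs = np.fft.rfftfreq(len(trace), dt)
    gauss = np.exp(-((freqs - 1.0 / period) / width) ** 2)
    narrow = np.fft.irfft(np.fft.rfft(trace) * gauss, len(trace))
    envelope = np.abs(hilbert(narrow))          # analytic-signal envelope
    return distance_km / (np.argmax(envelope) * dt)

for period in packets:
    print(period, group_velocity(trace, period, dt, distance_km))
```

Repeating this pick over many periods yields the dispersion curve from which the period-band maps above are inverted.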
29. Household-Level Spatiotemporal Patterns of Incidence of Cholera, Haiti, 2011
- Author
-
Fish and Wildlife Conservation, School of Public and International Affairs, Blackburn, Jason K., Diamond, Ulrica, Kracalik, Ian T., Widmer, Jocelyn M., Brown, Will, Morrissey, B. David, Alexander, Kathleen A., Curtis, Andrew J., Ali, Afsar, Morris, J. Glenn Jr., Fish and Wildlife Conservation, School of Public and International Affairs, Blackburn, Jason K., Diamond, Ulrica, Kracalik, Ian T., Widmer, Jocelyn M., Brown, Will, Morrissey, B. David, Alexander, Kathleen A., Curtis, Andrew J., Ali, Afsar, and Morris, J. Glenn Jr.
- Abstract
A cholera outbreak began in Haiti during October, 2010. Spatiotemporal patterns of household-level cholera in Ouest Department showed that the initial clusters tended to follow major roadways; subsequent clusters occurred further inland. Our data highlight transmission pathway complexities and the need for case and household-level analysis to understand disease spread and optimize interventions.
- Published
- 2014
30. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti
- Author
-
Curtis, Andrew J., Blackburn, Jason K., Widmer, Jocelyn M., Morris, J. Glenn Jr., Curtis, Andrew J., Blackburn, Jason K., Widmer, Jocelyn M., and Morris, J. Glenn Jr.
- Abstract
Background: Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using spatial video that can be used to improve analysis and involve participatory collaborations. A case study will be used to illustrate this approach with three health risks mapped at the street scale for a coastal community in Haiti.
Methods: Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentrations of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices, will show the utility of the method. In addition, schools offer potential locations for cholera education interventions.
Results: Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these “hotspots”.
Conclusions: Spatial video is a tool that can be used in any environment to improve local-area health analysis and intervention. The process is rapid and can be repeated in study sites t
- Published
- 2013
- Full Text
- View/download PDF
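The kernel density step described in the Methods above can be sketched with SciPy's Gaussian KDE; the risk-point cluster and the two "school" locations below are invented coordinates, not the Haiti data:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)

# Invented street-scale risk points (e.g. standing water, trash piles) in a
# local coordinate frame: one dense cluster plus scattered background points.
cluster = rng.normal(loc=[0.3, 0.7], scale=0.05, size=(80, 2))
background = rng.random((40, 2))
risks = np.vstack([cluster, background])

# Kernel density surface over the mapped risks (gaussian_kde expects data
# with shape (n_dimensions, n_points), hence the transpose).
kde = gaussian_kde(risks.T)

# Hypothetical school locations: one inside the hotspot, one far from it.
school_in_hotspot = np.array([[0.3], [0.7]])
school_elsewhere = np.array([[0.9], [0.1]])
print(kde(school_in_hotspot)[0], kde(school_elsewhere)[0])
```

Comparing the density at candidate intervention sites is the same "schools near hotspots" comparison the Results paragraph describes.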
31. Modeling Hurricane Katrina's merchantable timber and wood damage in south Mississippi using remotely sensed and field-measured data
- Author
-
Collins, Curtis Andrew and Collins, Curtis Andrew
- Subjects
- Timber Hurricane effects Simulation methods. Mississippi, Hurricane Katrina, 2005 Data processing., Severe storms Risk assessment Simulation methods., Severe storms Forecasting. Southern States, Bois d'œuvre Effets des ouragans sur Méthodes de simulation. Mississippi, Ouragan Katrina, 2005 Informatique., Violentes tempêtes Évaluation du risque Méthodes de simulation., Electronic data processing, Severe storms Forecasting, Mississippi, Southern States
- Abstract
Ordinary and weighted least squares multiple linear regression techniques were used to derive 720 models predicting Katrina-induced storm damage in cubic foot volume (outside bark) and green weight tons (outside bark). The large number of models was dictated by the use of three damage classes, three product types, and four forest type model strata. These 36 models were then fit and reported across 10 variable sets and variable set combinations for volume and ton units. Along with large model counts, potential independent variables were created using power transforms and interactions. The basis of these variables was field-measured plot data, satellite (Landsat TM and ETM+) imagery, and NOAA HWIND wind data variable types. As part of the modeling process, lone variable types as well as two-type and three-type combinations were examined. By deriving models with these varying inputs, model utility is flexible as not all independent variable data are needed in future applications. The large number of potential variables led to the use of forward, sequential, and exhaustive independent variable selection techniques. After variable selection, weighted least squares techniques were often employed using weights of one over the square root of the pre-storm volume or weight of interest. This was generally successful in improving residual variance homogeneity. Finished model fits, as represented by the coefficient of determination (R²), surpassed 0.5 in numerous models with values over 0.6 noted in a few cases. Given these models, an analyst is provided with a toolset to aid in risk assessment and disaster recovery should Katrina-like weather events recur.
- Published
- 2013
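The weighting scheme quoted in the abstract, weighted least squares with weights of one over the square root of the pre-storm volume, can be sketched on simulated heteroscedastic data. The covariates, coefficients and noise model below are invented stand-ins for the field and imagery variables:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated stand-in for the storm-damage regression: damage volume predicted
# from invented covariates, with error variance growing with pre-storm volume
# (the heteroscedasticity the weighting targets).
n = 500
pre_storm = rng.uniform(100, 2000, n)          # pre-storm volume per plot
wind = rng.uniform(20, 60, n)                  # e.g. a HWIND wind-speed proxy
X = np.column_stack([np.ones(n), pre_storm, wind])
beta_true = np.array([5.0, 0.4, 2.0])
noise = rng.standard_normal(n) * np.sqrt(pre_storm)   # heteroscedastic errors
y = X @ beta_true + noise

def wls(X, y, w):
    """Weighted least squares: scale each row by sqrt(w), then solve OLS."""
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta

beta_ols = wls(X, y, np.ones(n))                # ordinary least squares
beta_wls = wls(X, y, 1.0 / np.sqrt(pre_storm))  # weights as in the abstract
print(beta_ols, beta_wls)
```

Down-weighting high-volume plots, whose damage is noisier, is what improves the residual variance homogeneity noted in the abstract.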
32. The influence of Teach For America on Algebra I student achievement
- Author
-
Carroll, Curtis Andrew, NC DOCKS at The University of North Carolina at Charlotte, Carroll, Curtis Andrew, and NC DOCKS at The University of North Carolina at Charlotte
- Abstract
This non-experimental study examined the influence of an initiative that High Risk School District (pseudonym) implemented to offset the effect of low student academic performance in low-performing schools. The study attempted to answer the following research question: Does having a Teach For America (TFA) teacher have an influence on a student's Algebra I EOC score, independent of gender and race? Teach For America teachers were assigned to the district's most disenfranchised schools. Previous studies have revealed mixed results on TFA teachers' impact on student achievement. The researcher compared student performance on the Algebra I North Carolina End of Course test in High Risk Schools between TFA and non-TFA classrooms. To analyze the data, the responses were measured using the composite Algebra I EOC scores and the explanatory variables of student gender (male or female), race (African-American, Hispanic, and White) and teacher type (TFA or non-TFA), employing a hierarchical modeling procedure. After considering the nesting of students within different schools, the researcher used hierarchical linear modeling and found that students taught by TFA teachers outperformed students taught by non-TFA teachers, t(1956) = 3.23, p = .002. Students taught by TFA teachers outperformed students taught by non-TFA teachers for all subgroups (White, Black, and Hispanic; all ps < .01). The results of this study demonstrate that TFA teachers assigned to Algebra I classes have a significant influence on increasing student achievement. The researcher discusses the limitations of these findings. Other studies have shown that TFA teachers, in comparison to regularly certified teachers, have a negative influence on achievement.
- Published
- 2013
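Ignoring the school-level nesting that the study's hierarchical linear model accounts for, the flat two-group comparison behind a statistic like t(1956) = 3.23 can be sketched with simulated scores; the score scale, group sizes and effect size here are invented:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(4)

# Simulated Algebra I EOC scores (invented scale): a small positive shift for
# students of TFA teachers, as the study reports. The full analysis used
# hierarchical linear modelling to respect students nested within schools;
# this sketch shows only the flat two-group comparison.
non_tfa = rng.normal(150.0, 10.0, size=1200)
tfa = rng.normal(152.0, 10.0, size=760)

t_stat, p_value = ttest_ind(tfa, non_tfa)
print(round(t_stat, 2), round(p_value, 4))
```

A multilevel model adds school-level random effects on top of this comparison, which typically widens the standard errors relative to the flat t-test.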
33. The effect of surface conditioning on the bond strength of resin composite to amalgam
- Author
-
Blum, Igor R, Hafiana, Khaula, Curtis, Andrew, Barbour, Michele E, Attin, Thomas, Lynch, Christopher D, Jagger, Daryll C, Blum, Igor R, Hafiana, Khaula, Curtis, Andrew, Barbour, Michele E, Attin, Thomas, Lynch, Christopher D, and Jagger, Daryll C
- Abstract
OBJECTIVES: This study evaluated the effect of different surface conditioning methods on the tensile bond strength (TBS) and integrity of the amalgam-resin composite interface, using commercially available restoration repair systems. METHODS: One hundred and sixty Gamma 2 amalgam specimens were stored in artificial saliva for 2 weeks and then randomly assigned to one of the following conditioning groups (n=20/group): Group 1: air abrasion, alloy primer and 'Panavia 21'; Group 2: air abrasion and 'Amalgambond Plus'; Group 3: air abrasion and 'All-Bond 3'; Group 4: diamond bur, alloy primer and 'Panavia 21'; Group 5: diamond bur and 'Amalgambond Plus'; Group 6: diamond bur and 'All-Bond 3'; Group 7: silica coating technique; and Group 8: non-conditioned amalgam surfaces (control group). Subsequently, resin composite material was added to the substrate surfaces and the amalgam-resin composite specimens were subjected to TBS testing. Representative samples from the test groups were subjected to scanning electron microscopy and surface profilometry. The data were analysed statistically with one-way ANOVA and post hoc Tukey's tests (α=0.05). RESULTS: The mean TBS of amalgam-resin composite ranged between 1.34 and 5.13 MPa and varied with the degree of amalgam surface roughness and the type of conditioning technique employed. The significantly highest TBS values (5.13±0.96 MPa) were obtained in Group 1 (p=0.013). CONCLUSION: Under the tested conditions, significantly greater tensile bond strength of resin composite to amalgam was achieved when the substrate surface was conditioned by air abrasion followed by the application of the Panavia 21 adhesive system. CLINICAL SIGNIFICANCE: Effecting a repair of an amalgam restoration with resin composite via the use of air abrasion and application of Panavia 21 would seem to enhance the integrity of the amalgam-resin composite interface. Clinical trials involving the implementation of this technique are indicated to determine the usefulne
- Published
- 2012
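The statistical analysis described above, one-way ANOVA across conditioning groups, can be sketched with simulated bond strengths. Group 1 and the control use the reported mean values; the middle group's mean is assumed purely for illustration:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(5)

# Simulated tensile bond strengths (MPa), n = 20 per group as in the study.
group1 = rng.normal(5.13, 0.96, 20)  # air abrasion + alloy primer + Panavia 21 (reported mean/SD)
group4 = rng.normal(3.0, 0.9, 20)    # diamond bur + alloy primer + Panavia 21 (mean assumed)
group8 = rng.normal(1.34, 0.5, 20)   # non-conditioned control (reported lower bound as mean)

# One-way ANOVA tests whether any group mean differs; a post hoc test such as
# Tukey's HSD (used in the paper) then identifies which pairs differ.
f_stat, p_value = f_oneway(group1, group4, group8)
print(round(f_stat, 2), p_value)
```

A significant F statistic here justifies the pairwise Tukey comparisons that single out Group 1 in the Results above.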
34. Seismic interferometry and ambient noise tomography in the British Isles
- Author
-
Nicolson, Heather, Curtis, Andrew, Baptie, Brian, Galetti, Erica, Nicolson, Heather, Curtis, Andrew, Baptie, Brian, and Galetti, Erica
- Abstract
Traditional methods of imaging the Earth's subsurface using seismic waves require an identifiable, impulsive source of seismic energy, for example an earthquake or explosive source. Naturally occurring, ambient seismic waves form an ever-present source of energy that is conventionally regarded as unusable since it is not impulsive. As such it is generally removed from seismic data and subsequent analysis. A new method known as seismic interferometry can be used to extract useful information about the Earth's subsurface from the ambient noise wavefield. Consequently, seismic interferometry is an important new tool for exploring areas which are otherwise seismically quiescent, such as the British Isles in which there are relatively few strong earthquakes. One of the possible applications of seismic interferometry is ambient noise tomography (ANT). ANT is a way of using interferometry to image subsurface seismic velocity variations using seismic (surface) waves extracted from the background ambient vibrations of the Earth. To date, ANT has been used successfully to image the Earth's crust and upper-mantle on regional and continental scales in many locations and has the power to resolve major geological features such as sedimentary basins and igneous and metamorphic cores. Here we provide a review of seismic interferometry and ANT, and show that the seismic interferometry method works well within the British Isles. We illustrate the usefulness of the method in seismically quiescent areas by presenting the first surface wave group velocity maps of the Scottish Highlands using only ambient seismic noise. These maps show low velocity anomalies in sedimentary basins such as the Moray Firth, and high velocity anomalies in igneous and metamorphic centres such as the Lewisian complex. They also suggest that the Moho shallows from south to north across Scotland which agrees with previous geophysical studies in the region.
- Published
- 2012
35. Isomorphism of graph classes related to the circular-ones property
- Author
-
Curtis, Andrew R., Lin, Min Chih, McConnell, Ross M., Nussbaum, Yahav, Soulignac, Francisco J., Spinrad, Jeremy P., Szwarcfiter, Jayme L., Curtis, Andrew R., Lin, Min Chih, McConnell, Ross M., Nussbaum, Yahav, Soulignac, Francisco J., Spinrad, Jeremy P., and Szwarcfiter, Jayme L.
- Abstract
We give a linear-time algorithm that checks for isomorphism between two 0-1 matrices that obey the circular-ones property. This algorithm leads to linear-time isomorphism algorithms for related graph classes, including Helly circular-arc graphs, Γ-circular-arc graphs, proper circular-arc graphs and convex-round graphs., Comment: 25 pages, 9 figures
- Published
- 2012
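The paper's linear-time isomorphism algorithm is intricate, but a simple invariant it relates to, comparing circular 0-1 rows up to rotation via a canonical (lexicographically minimal) rotation, can be sketched. This is only a necessary-condition fingerprint (consistent column rotation rotates every row by the same shift, and row permutation permutes rows, so the multiset below is preserved), not the paper's algorithm:

```python
def min_rotation(s):
    """Lexicographically minimal rotation of a string (naive O(n^2) version;
    Booth's algorithm does this in linear time)."""
    return min(s[i:] + s[:i] for i in range(len(s)))

def canonical_rows(matrix):
    """Canonicalize each row of a 0-1 matrix with the circular-ones property,
    so matrices whose columns differ only by a cyclic shift compare equal."""
    return sorted(min_rotation("".join(map(str, row))) for row in matrix)

# Two matrices whose rows are cyclic shifts of each other's rows:
a = [[1, 1, 0, 0], [0, 1, 1, 1], [1, 0, 0, 1]]
b = [[0, 0, 1, 1], [1, 1, 1, 0], [0, 1, 1, 0]]
print(canonical_rows(a) == canonical_rows(b))  # rows match up to rotation
```

Matching fingerprints do not prove isomorphism (each row is canonicalized independently here), which is why the full algorithm needs the more careful machinery developed in the paper.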
36. Frontiers of seismology
- Author
-
Sargeant, Susanne, Ottemoller, Lars, Baptie, Brian, Bell, Andy, Curtis, Andrew, Main, Ian G., Sargeant, Susanne, Ottemoller, Lars, Baptie, Brian, Bell, Andy, Curtis, Andrew, and Main, Ian G.
- Abstract
Frontiers of Seismology was a wide-ranging, cross-disciplinary meeting held in Edinburgh in April this year. Susanne Sargeant, Lars Ottemöller, Brian Baptie, Andy Bell, Andrew Curtis and Ian Main join forces to give a flavour of the meeting and the new strengths it revealed in seismology in the UK.
- Published
- 2009
37. The Academies programme: Progress, problems and possibilities
- Author
-
Curtis, Andrew, Exley, Sonia, Sasia, Amanda, Tough, Sarah, Whitty, Geoff, Curtis, Andrew, Exley, Sonia, Sasia, Amanda, Tough, Sarah, and Whitty, Geoff
- Published
- 2008
38. Primed for Success? The characteristics and practices of state schools with good track records of entry into prestigious UK universities
- Author
-
Curtis, Andrew, Power, Sally, Whitty, Geoff, Exley, Sonia, Sasia, Amanda, Curtis, Andrew, Power, Sally, Whitty, Geoff, Exley, Sonia, and Sasia, Amanda
- Published
- 2008
39. Integrated remotely sensed datasets for disaster management
- Author
-
McCarthy, Tim, Farrell, Ronan, Curtis, Andrew, Fotheringham, Stewart, McCarthy, Tim, Farrell, Ronan, Curtis, Andrew, and Fotheringham, Stewart
- Abstract
Video imagery can be acquired from aerial, terrestrial and marine based platforms and has been exploited for a range of remote sensing applications over the past two decades. Examples include coastal surveys using aerial video, route-corridor infrastructure surveys using vehicle-mounted video cameras, aerial surveys over forestry and agriculture, underwater habitat mapping and disaster management. Many of these video systems are based on interlaced television standards such as North America's NTSC and the European SECAM and PAL television systems, which are then recorded using various video formats. This technology has recently been employed as a front-line remote sensing technology for damage assessment post-disaster. This paper traces the development of spatial video as a remote sensing tool from the early 1980s to the present day. The background to a new spatial-video research initiative based at the National University of Ireland, Maynooth (NUIM) is described. New improvements are proposed, including low-cost encoders, easy-to-use software decoders, timing issues and interoperability. These developments will enable specialists and non-specialists to collect, process and integrate these datasets with minimal support. This integrated approach will enable decision makers to access relevant remotely sensed datasets quickly and so carry out rapid damage assessment during and post-disaster.
- Published
- 2008
40. Understanding the Geography of Post-Traumatic Stress: An Academic Justification for Using a Spatial Video Acquisition System in the Response to Hurricane Katrina
- Author
-
Curtis, Andrew, Mills, Jacqueline W., Kennedy, Barrett, Fotheringham, Stewart, McCarthy, Tim, Curtis, Andrew, Mills, Jacqueline W., Kennedy, Barrett, Fotheringham, Stewart, and McCarthy, Tim
- Abstract
In the aftermath of a disaster like Hurricane Katrina, remote-sensing methods are often employed in an effort to assess damage. However, their utility may be limited by the aerial perspective and image resolution. The Spatial Video Acquisition System (SVAS), in conjunction with a Geographic Information System (GIS), has the potential to be a complementary methodology for obtaining damage assessment information as well as capturing recovery related geographies associated with post-traumatic stress. An example is provided from the Lower 9th Ward of New Orleans with data that could be used to predict neighborhood post-traumatic stress. Results reveal six dimensions in which a SVAS can improve existing disaster-related data collection approaches: organization, archiving, transferability, evaluation, objectivity, and feasibility.
- Published
- 2007
41. Understanding social capital : are the problems inherent in Putnam's concept intractable?
- Author
-
Curtis, Andrew. and Curtis, Andrew.
- Abstract
Putnam’s version of social capital, and the main problems with it, are outlined in Chapter One of this thesis. In Chapters Two and Three alternative conceptual approaches are examined to see whether they might resolve any of the difficulties in Putnam’s work. The six problems arising from Putnam’s work identified in Chapter One are: 1) the lack of a developed conceptual framework; 2) whether macro-level analysis is appropriate; 3) how the concept fits with considerations of structure and agency; 4) whether the negative aspects of social capital are fully taken into account; 5) what the relative merits of “bonding” and “bridging” social capital are; and 6) whether social capital is only ever a by-product of other activities or can also be consciously created. In Chapter Two, Coleman and Ostrom’s separate work on social capital is analysed. They use the concept as part of an attempt to add broader social considerations to theories of rational and collective action. In Chapter Three, the main authors examined are Bourdieu and Lin. Bourdieu uses social capital to complement his concept of cultural capital in looking at the reproduction of inequality. Lin develops a theory of social capital that focuses on individuals’ action in pursuing resources in networks. It emerges that the other authors can contribute various elements that help to address some of the problems in Putnam’s work. Yet the most appropriate level of analysis and the full implications of bridging social capital remain points of contention. In Chapter Four the future of Putnam’s use of social capital is debated and it is concluded that he will have to abandon his macro-level analysis if the full conceptual intricacies of social capital are to be realised.
- Published
- 2007
42. Influence of in vitro elaidic acid or trans-vaccenic acid uptake and lactogenic hormone stimulation on fatty acid content of mouse mammary cells
- Author
-
Baughman, Curtis Andrew and Baughman, Curtis Andrew
- Abstract
The objective of the study was to examine the effects of trans-9-octadecenoic acid (elaidic) and trans-11-octadecenoic acid (trans-vaccenic) on uptake and alteration of exogenous fatty acids by mouse mammary epithelial cells. Cells from a subclone of the COMMA-D cell line were plated on uncoated plastic petri dishes and grown to confluence. Supplemental fatty acids bound to bovine serum albumin were added to the medium applied to the confluent cell cultures. Treatments included 200 µM octadecanoic acid (C18:0) as a control and 100 µM C18:0 with one of the following µM ratios of cis-octadecenoic acid (cis-C18:1) to elaidic or trans-vaccenic: 100:0, 75:25, 50:50, 0:100. In addition, all treatments were conducted with or without lactogenic hormone supplementation. The cellular protein to DNA ratio and total amount of fatty acids per mg protein were decreased (P < .05) by addition of lactogenic hormones. In treatments without hormone supplementation, however, the total amount of cellular fatty acids per mg protein was decreased (P < .05) by addition of either trans-C18:1 isomer. Results indicated a significant (P < .05) relationship between the concentration of trans-C18:1 in the media and uptake of trans-C18:1 isomers, and retroconversion of trans-C18:1 to trans-C16:1. The slopes of the lines for cellular C16:0, cis-C16:1, and cis-C18:1 were less (P < .05) than zero as concentration of trans-C18:1 in the media increased. However, trans-C18:1 isomers did not influence the proportion of polar and nonpolar lipids synthesized by the cells. It appears that trans fatty acids may depress milk fat output by decreasing de novo fatty acid synthesis and cis-C18:1 content.
- Published
- 1996
43. Marchenko-Lippmann-Schwinger inversion
- Author
-
Cummings, Dominic Gerard, Curtis, Andrew, and Ziolkowski, Antoni
- Subjects
Seismic wave reflections ,Lippmann-Schwinger equation ,Marchenko methods ,linear Lippmann-Schwinger and Born inverse problems ,full scattering Green's function ,Marchenko-Lippmann-Schwinger full waveform inversion ,1D Born inversion ,matrix inversion ,boxcar perturbation ,2D syncline model ,Tikhonov damping - Abstract
Seismic wave reflections recorded at the Earth's surface provide a rich source of information about the structure of the subsurface. These reflections occur due to changes in the material properties of the Earth; in the acoustic approximation, these are the density of the Earth and the velocity of seismic waves travelling through it. Therefore, there is a physical relationship between the material properties of the Earth and the reflected seismic waves that we observe at the surface. This relationship is non-linear, due to the highly scattering nature of the Earth, and to our inability to accurately reproduce these scattered waves with the low resolution velocity models that are usually available to us. Typically, we linearize the scattering problem by assuming that the waves are singly-scattered, requiring multiple reflections to be removed from recorded data at great effort and with varying degrees of success. This assumption is called the Born approximation. The equation that describes the relationship between the Earth's properties and the fully-scattering reflection data is called the Lippmann-Schwinger equation, and this equation is linear if the full scattering wavefield inside the Earth could be known. The development of Marchenko methods makes such wavefields possible to estimate using only the surface reflection data and an estimate of the direct wave from the surface to each point in the Earth. Substituting the results from a Marchenko method into the Lippmann-Schwinger equation results in a linear equation that includes all orders of scattering. The aim of this thesis is to determine whether higher orders of scattering improve the linear inverse problem from data to velocities, by comparing linearized inversion under the Born approximation to the inversion of the linear Lippmann-Schwinger equation. This thesis begins by deriving the linear Lippmann-Schwinger and Born inverse problems, and reviewing the theoretical basis for Marchenko methods. 
By deriving the derivative of the full scattering Green's function with respect to the model parameters of the Earth, the gradient direction for a new type of least-squares full waveform inversion called Marchenko-Lippmann-Schwinger full waveform inversion is defined that uses all orders of scattering. By recreating the analytical 1D Born inversion of a boxcar perturbation by Beydoun and Tarantola (1988), it is shown that high frequency-sampling density is required to correctly estimate the amplitude of the velocity perturbation. More importantly, even when the scattered wavefield is defined to be singly-scattering and the velocity model perturbation can be found without matrix inversion, Born inversion cannot reproduce the true velocity structure exactly. When the results of analytical inversion are compared to inversions where the inverse matrices have been explicitly calculated, the analytical inversion is found to be superior. All three matrix inversion methods are found to be extremely ill-posed. With regularisation, it is possible to accurately determine the edges of the perturbation, but not the amplitude. Moving from a boxcar perturbation with a homogeneous starting velocity to a many-layered 1D model and a smooth representation of this model as the starting point, it is found that the inversion solution is highly dependent on the starting model. By optimising an iterative inversion in both the model and data domains, it is found that optimising the velocity model misfit does not guarantee improvement in the resulting data misfit, and vice versa. Comparing unregularised inversion to inversions with Tikhonov damping or smoothing applied to the kernel matrix, it is found that strong Tikhonov damping results in the most accurate velocity models. 
From the consistent under-performance of Lippmann-Schwinger inversion when using Marchenko-derived Green's functions compared to inversions carried out with true Green's functions, it is concluded that the fallibility of Marchenko methods results in inferior inversion results. Born and Lippmann-Schwinger inversion are tested on a 2D syncline model. Due to computational limitations, using all sources and receivers in the inversion required limiting the number of frequencies to 5. Without regularisation, the model update is uninterpretable due to the presence of strong oscillations across the model. With strong Tikhonov damping, the model updates obtained are poorly scaled, have low resolution, and low amplitude oscillatory noise remains. By replacing the inversion of all sources simultaneously with single source inversions, it is possible to reinstate all frequencies within our limited computational resources. These single source model updates can be stacked similarly to migration images to improve the overall model update. As predicted by the 1D analytical inversion, restoring the full frequency bandwidth eliminates the oscillatory noise from the inverse solution. With or without regularisation, Born and Lippmann-Schwinger inversion results are found to be nearly identical. When Marchenko-derived Green's functions are introduced, the inversion results are worse than either the Born inversion or the Lippmann-Schwinger inversion without Marchenko methods. On this basis, one concludes that the inclusion of higher order scattering does not improve the outcome of solving the linear inverse scattering problem using currently available methods. Nevertheless, some recent developments in the methods used to solve the Marchenko equation hold some promise for improving solutions in future.
- Published
- 2023
- Full Text
- View/download PDF
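The regularisation behaviour described in the abstract above can be sketched numerically. This is a minimal illustration, not the thesis code: the Gaussian kernel matrix, boxcar model, noise level, and damping weight are all invented for the demonstration. It shows an ill-posed linear inversion where unregularised least squares amplifies noise, while Tikhonov damping recovers the location of a boxcar perturbation (though, as the abstract notes, not necessarily its amplitude).

```python
import numpy as np

rng = np.random.default_rng(0)

# 1D model: a boxcar perturbation, echoing the analytical Born example.
n = 100
x = np.linspace(0.0, 1.0, n)
m_true = np.where((x > 0.4) & (x < 0.6), 1.0, 0.0)

# Ill-conditioned linear forward operator: a Gaussian smoothing kernel
# standing in for a band-limited sensitivity matrix (illustrative only).
w = 0.05
G = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2.0 * w**2))
G /= G.sum(axis=1, keepdims=True)

d = G @ m_true + 1e-3 * rng.standard_normal(n)  # noisy synthetic data

# Unregularised least squares: small singular values amplify the noise.
m_naive, *_ = np.linalg.lstsq(G, d, rcond=None)

# Tikhonov damping: minimise ||G m - d||^2 + lam^2 ||m||^2.
lam = 0.05
m_damped = np.linalg.solve(G.T @ G + lam**2 * np.eye(n), G.T @ d)

err_naive = np.linalg.norm(m_naive - m_true)
err_damped = np.linalg.norm(m_damped - m_true)
print(err_damped < err_naive)  # damping locates the boxcar far better
```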
44. Portrait of Robert Manne, Melbourne, 2005
- Author
-
Curtis, Andrew, 1966
- Abstract
Title from caption provided by the photographer. Acquired in digital format; access copy available online. Mode of access: Internet via World Wide Web. Image used on cover of Australian Book Review, no. 272, June/July 2005.
45. Re-reading the Gospel of Luke today : from a first century urban writing site to a twentieth century urban reading site
- Author
-
Curtis, Andrew
- Abstract
Postmodern theorising has presented the reader as an active agent in the process of the interpretation of texts. Sociology of knowledge approaches have identified both the author and the reader of texts as socially embodied within a context. This study presents a unique collection of readings in the Gospel of Luke by ordinary real-readers from a disadvantaged and/or marginalised social and ecclesial location, within an affluent first world context. These readings, transcribed in Volume Two, present empirical reader research for analysis, through dialogue and conversation with professional readings in the Gospel of Luke, in order to assess what contribution the former might make to contemporary hermeneutics. Identifying contemporary human experience of ordinary real-readers as the starting point in their reading of the Lukan text, the study illustrates how these readings act as a useful tool of suspicion in conversation with readings that claim to be objective and value-neutral, and how they facilitate critical reflection on the ideological and theological commitments of the dominant classes in society and church. The value and legitimacy of the readings of ordinary real-readers is discussed, and how their social and ecclesial marginalisation and disadvantage provides a nontotalising presence in biblical interpretation, a presence that guards against the claims of permanence made by those in the academic and ecclesial world. Identification of contemporary human experience as inevitably influencing the process of interpretation leads to a consideration of the place of the historical critical paradigm in biblical studies. The value and legitimacy of ordinary real readers as active agents in the process of interpretation, and the contribution they make to contemporary hermeneutics, requires a consideration of safeguards against reading anarchy. The process of self and social analysis, and an openness to dialogue and conversation with those outside our own contexts, incl
47. Uncertainty quantification in seismic interferometry
- Author
-
Ayala-Garcia, Daniella, Branicki, Michal, and Curtis, Andrew
- Subjects
uncertainty quantification, seismic interferometry, stationary phase, correlated noise, ambient noise, wavefield interferometry, correlation interferometry
- Abstract
It is a well-established principle that cross-correlating wavefield observations at different receiver locations yields new responses that, under certain conditions, provide a useful estimate of the Green's function between the receiver locations. This principle, known as wavefield interferometry, is a powerful technique that transforms previously discarded data, such as background noise or earthquake codas (the multiply scattered tails of earthquake seismograms), into useful signals that allow us to remotely illuminate subsurface Earth structures. The mathematical machinery that underlies wavefield interferometry assumes a number of ideal conditions that are not often found in practical settings. Furthermore, the original formulations are frequently simplified through a variety of approximations, in order to derive expressions that are more amenable to applications. These assumptions and approximations are frequently made in an ad hoc fashion, without consideration of the errors thus introduced. This thesis centres on the study of errors introduced by violating two important assumptions of wavefield interferometry: namely, that the noise sources are statistically uncorrelated, and that their energy contributions are isotropic. Violating these conditions makes the Green's function and associated phases liable to estimation errors that so far have not been accounted for or corrected. We show that these errors are indeed significant for commonly used noise sources, and illustrate cases in which the errors completely obscure the phase one wishes to retrieve. Moreover, we consider the relevant case of the stationary phase approximation, widely invoked in interferometry theory and applications, and quantify mathematically the errors introduced both in an ideal setting and in the presence of correlated, anisotropic sources, applying and extending existing error quantification theory.
Throughout these settings, this thesis implements an appropriate geostatistical correlation model to investigate the effect that smoothness and long-range correlations have on the interferometric estimate, particularly its phase. Analytical expressions are given for the first and second moments of these errors, as well as deterministic error bounds and probability bounds on the uncertainty of these approximations. These results are given in terms of statistical parameters that can be empirically estimated in practice, and numerically explored. Finally, this thesis contrasts the two main types of wavefield interferometry, active or controlled source interferometry and passive or ambient noise interferometry. The impact of violating the uncorrelatedness assumption is considered. This thesis proposes strategies to mitigate uncertainty in both settings, and in the case of ambient noise interferometry, the thesis presents a novel workflow that significantly mitigates errors introduced by the presence of statistical correlations in the sources. The methodology is general in the sense that it can be applied to noise with any degree of correlation, including completely uncorrelated sources. The methods are tested on synthetic data, illustrating significant improvement in the phase estimates in both settings. In all these cases we establish various bounds on the estimation error, and we analyse their significance and utility in real-life interferometric retrieval experiments.
- Published
- 2022
- Full Text
- View/download PDF
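The retrieval principle underlying the abstract above can be illustrated in a few lines: cross-correlating two receivers' recordings of the same noise peaks at the travel-time lag between them. This is a purely kinematic toy sketch under the ideal assumptions the thesis interrogates (a single far-field source, perfectly random noise); the record length and lag are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# One far-field random noise source recorded at two receivers, A and B;
# B hears the same wavefield lag_true samples later (toy kinematics).
n = 20000
lag_true = 25
src = rng.standard_normal(n)
rec_a = src
rec_b = np.roll(src, lag_true)

# Cross-correlation of the two records peaks at the inter-receiver
# travel time: the phase of the estimated Green's function.
xcorr = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-n + 1, n)
lag_est = int(lags[np.argmax(xcorr)])
print(lag_est)  # 25
```

When the sources are correlated or anisotropically distributed, as studied in the thesis, this peak shifts or blurs, which is exactly the phase error being quantified.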
48. On primaries-only travel times construction using Marchenko redatuming
- Author
-
Dokter, Eva, Curtis, Andrew, Meles, Giovanni, and Williams, Wyn
- Abstract
Extraction of primaries-only reflection data from full surface reflection data, by a combination of Marchenko redatuming and convolutional interferometry, incorporates artefacts and noise following the constructed primaries. I have shown that the only true information recovered is the primaries' travel times; any dynamic information is artificial. I develop two extended versions of the original workflow. One processes seismic data throughout, avoids noise following the primary, and mitigates artefacts typical of this type of primaries construction workflow. The other uses travel time information as input and output, avoids noise following the primary, and also avoids artefacts by applying a quality control criterion during the convolutional interferometry step. It is cheaper and faster than the first version, and the resulting 2D primaries travel time matrices need less storage than seismograms. For complex data, it is difficult to extract enough kinematic information to gain an advantage over the full reflection data during velocity analysis and velocity model building. The mechanics of the primaries construction workflow introduce a new class of artificial arrivals, which are indistinguishable from true arrivals; I have identified them here by imaging with the true model. In addition, the bottom of a thin layer can be omitted from the final data, even if the primary reflection is present in the input data. I have used a horizontally layered subsurface model to show that mathematical convergence of the Marchenko redatuming scheme, as measured by the behaviour of convergence energy, is no guarantee of physical convergence towards the correct solution. Inverting for redatuming parameters using the trend of the convergence energy, or related measures applied to the same wavefields, can produce reliable-looking results, but these are misleading and mostly false. 
Since the approach fails for a specific model, it must be considered unreliable in general.
- Published
- 2022
- Full Text
- View/download PDF
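The kinematic core of the travel-time construction described in the abstract above is that convolution sums travel times: convolving a trace from the surface to a redatumed virtual point with one from the virtual point to a receiver places the constructed primary at the summed time. A minimal sketch with two spike traces (the sample indices are invented for illustration):

```python
import numpy as np

# Two spike "Green's functions": surface-to-virtual-point travel time t1
# and virtual-point-to-receiver travel time t2 (in samples, illustrative).
n = 200
t1, t2 = 40, 55
g1 = np.zeros(n)
g1[t1] = 1.0
g2 = np.zeros(n)
g2[t2] = 1.0

# Convolution places the constructed event at t1 + t2.  Its amplitude
# carries no true dynamic information, in line with the abstract's
# conclusion that only the travel times are recovered.
primary = np.convolve(g1, g2)
print(int(np.argmax(primary)))  # 95
```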
49. Time evolution of the electric field by the rapid expansion method in controlled-source electromagnetic (CSEM) applications
- Author
-
Liu, Yikuo, Ziolkowski, Anton, Hagdorn, Magnus, and Curtis, Andrew
- Subjects
530.14, low-frequency, time-domain electromagnetic data, CSEM, rapid expansion method, REM, time-domain forward modelling, REM algorithms
- Abstract
I address the problem of modelling the low-frequency, time-domain controlled-source electromagnetic (CSEM) data by the rapid expansion method (REM). The CSEM method is an active EM exploration method that is recognized as complementary to the seismic method, with the focus on determining subsurface electric resistivity. Interpretation of CSEM data relies on an iterative forward modelling process to search for the model that best fits the data. Therefore, forward modelling is an essential part of the interpretation process. REM is an explicit time-domain forward modelling method that solves the diffusive EM field based on a Chebyshev expansion of the time operator. The temporal estimator is accurate to the Nyquist frequency and temporal numerical dispersion can be mitigated. I present several extensions of the REM algorithm to generalize its use in various environments. I show the response from the Earth-air interface can be modelled by solving the air field explicitly in the Chebyshev domain. I show that transverse isotropic anisotropy can be included in the modelling with the manipulation of the conductivity tensor. I show that by introducing another fictitious series of Chebyshev polynomials, the updating of Chebyshev terms is equivalent to coupled EM wave equations in a vacuum. EM wavefield modelling techniques can therefore be transferred to the Chebyshev domain, and I show the use of perfectly matched layers, a well-established absorbing boundary condition designed for EM waves, to solve the numerical boundary problems in the Chebyshev method. I have made two improvements to the numerical efficiency of REM modelling of CSEM data. First, I develop a workflow to solve the 3D electric field by REM but with a 2D model. If the earth model can be simplified to 2D structures, the computational cost to achieve a 3D solution can be reduced by an order of magnitude. 
Secondly, the code has been parallelized on graphics processing units (GPUs), improving performance by a factor of over 100 compared with the serial REM code implemented in C. These new functionalities make the REM algorithm an accurate forward modeller that solves the time-domain electric field efficiently in various environments. Subsequent CSEM inversion studies can therefore benefit from the method to extract resistivity models from full-bandwidth CSEM field data, which should bring us closer to the true subsurface.
- Published
- 2021
- Full Text
- View/download PDF
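The expansion underlying REM, as described in the abstract above, replaces a function of the time-stepping operator by a truncated Chebyshev series. A scalar sketch of that idea, using numpy's Chebyshev utilities on a decaying exponential standing in for the diffusive kernel (the kernel and degree are illustrative, not from the thesis):

```python
import numpy as np
from numpy.polynomial import chebyshev as cheb

# Decaying kernel on [-1, 1], standing in for the diffusive time
# operator acting on a spectrally bounded argument (illustrative).
def kernel(s):
    return np.exp(-5.0 * (s + 1.0))

deg = 20
coef = cheb.chebinterpolate(kernel, deg)  # Chebyshev-series coefficients

s = np.linspace(-1.0, 1.0, 1001)
max_err = np.max(np.abs(cheb.chebval(s, coef) - kernel(s)))
print(max_err < 1e-10)  # the truncated series converges spectrally fast
```

The rapid convergence of such truncated series for smooth kernels is what makes the Chebyshev-domain time stepping accurate up to the Nyquist frequency.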
50. Investigating the internal structure of glaciers and ice sheets using Ground Penetrating Radar
- Author
-
Delf, Richard John, Bingham, Robert, Giannopoulos, Antonios, Curtis, Andrew, and Nienow, Peter
- Subjects
glaciology, geophysics, ice penetrating radar, glacier flow modelling
- Abstract
Ice penetrating radar (IPR) is a key tool in understanding the internal geometry and nature of glaciers and ice sheets, and has widely been used to derive bed topography, map internal layers and understand the thermal state of the cryosphere. Modern glacier and ice-sheet models facilitate increased assimilation of observations of englacial structure, including glacier thermal state and internal-layer geometry, yet the products available from radar surveys are often under-utilised. This thesis presents the development and assessment of radar processing strategies to improve quantitative retrievals from commonly acquired radar data. The first major focus of this thesis centres on deriving englacial velocities from zero-offset IPR data. Water held within micro- and macro-scale pores in ice has a direct influence on radar velocity, and significantly reduces ice viscosity and hence impacts the long-term evolution of polythermal glaciers. Knowledge of the radar velocity field is essential to retrieve correct bed topography from depth conversion processing, yet bed topography is often estimated assuming constant velocity, and potential errors from lateral variations in the velocity field are neglected. Here I calculate the englacial radar velocity field from common offset IPR data collected on Von Postbreen, a polythermal glacier in Svalbard. I first extract the diffracted wavefield using local coherent stacking, then use the focusing metric of negative entropy to deduce a local migration velocity field from constant-velocity migration panels and produce a glacier-wide model of local radar velocity. I show that this velocity field is successful in differentiating between areas of cold and temperate ice and can detect lateral variations in radar velocity close to the glacier bed. 
The effects of this velocity field in both migration and depth-conversion of the bed reflection are shown to result in consistently lower ice depths across the glacier, indicating that diffraction focusing and velocity estimation are crucial in retrieving correct bed topography in the presence of temperate ice. For the thesis' second major component I undertake an assessment of automated techniques for tracing and interpreting ice-sheet internal stratigraphy. Radar surveys across ice sheets typically measure numerous englacial layers that can often be regarded as isochrones. Such layers are valuable for extrapolating age-depth relationships away from ice-core locations, reconstructing palaeoaccumulation variability, and investigating past ice-sheet dynamics. However, the use of englacial layers in Antarctica has been hampered by underdeveloped techniques for characterising layer continuity and geometry over large distances, with techniques developed independently and little opportunity for inter-comparison of results. Here I present a methodology to assess the performance of automated layer-tracking and layer-dip-estimation algorithms through their ability to propagate a correct age-depth model. I use this to assess isochrone-tracking techniques applied to two test case datasets, selected from CreSIS MCoRDS data over Antarctica from a range of environments including low-dip, continuous layers and layers with terminations. I find that dip-estimation techniques are generally successful in tracking englacial dip but break down in the upper and lower regions of the ice sheet. The results of testing two previously published layer-tracking algorithms show that further development is required to attain a good constraint of age-depth relationship away from dated ice cores. 
I make the recommendation that auto-tracking techniques focus on improved linking of picked stratigraphy across signal disruptions to enable accurate determination of the Antarctic-wide age-depth structure. The final aspect of the thesis focuses on Finite-Difference Time-Domain (FDTD) modelling of IPR data. I present a sliced-3D approach to FDTD modelling, whereby a thin 3D domain is used to replicate modelling of full 3D polarisation while reducing computational cost. Sliced-3D modelling makes use of perfectly matched layer (PML) boundary conditions, and requires tuning of PML parameters to minimise non-physical reflections from the model-PML interface. I investigate the frequency dependence of PML parameters, and establish a relationship between complex frequency stretching parameters and effective wavelength. The resultant parameter choice is shown to minimise propagation errors in the context of a simple radioglaciological model, where 3D domains may be prohibitively large, and for a near-surface cross-borehole survey configuration, a case where full waveform inversion may typically be used.
- Published
- 2021
- Full Text
- View/download PDF
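The depth-conversion sensitivity described in the abstract above reduces to d = v t / 2 for a two-way travel time t. A minimal sketch with illustrative values (a typical cold-ice radar velocity of about 0.168 m/ns, and a hypothetical lower average velocity where temperate ice holds water; the travel-time pick is invented):

```python
# Two-way travel time to the bed, in nanoseconds (illustrative pick).
t_twt = 2400.0

v_cold = 0.168   # m/ns, typical dry cold ice
v_temp = 0.157   # m/ns, hypothetical average for water-bearing temperate ice

# Depth conversion d = v * t / 2: a lower local radar velocity yields a
# shallower bed for the same travel time, so assuming cold ice everywhere
# biases the retrieved bed topography.
d_cold = v_cold * t_twt / 2.0
d_temp = v_temp * t_twt / 2.0
print(round(d_cold - d_temp, 1))  # depth bias in metres
```

This is the mechanism behind the thesis's finding that a spatially varying velocity model gives consistently lower ice depths than a constant-velocity assumption.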