Search Results
12,986 results for "TIME series analysis"
2. Exploring [formula omitted] capture and its activation with computational integration
- Author
- Sadhukhan, Suchetana and Yadav, Vivek Kumar
- Published
- 2024
3. Evaluating the sustainability of groundwater abstraction in small watersheds using time series analysis
- Author
- Mahmood, Muhammad Qasim, Wang, Xiuquan, Aziz, Farhan, and Pang, Tianze
- Published
- 2024
4. Unraveling aquifer dynamics: Time series evaluation for informed groundwater management
- Author
- Samani, Saeideh
- Published
- 2024
5. On the time series analysis of resistive switching devices
- Author
- Thorat, Parth S., Kumbhar, Dhananjay D., Oval, Ruchik D., Kumar, Sanjay, Awale, Manik, Ramanathan, T.V., Khot, Atul C., Kim, Tae Geun, Dongale, Tukaram D., and Sutar, Santosh S.
- Published
- 2024
6. Forecasting shipbuilding demand using shipping market modeling: A case study of LNGC
- Author
- Han, Seung Woo, Kwak, Dong Hoon, Byeon, Geon-woong, and Woo, Jong Hun
- Published
- 2024
7. Construction of a probabilistic finite state automaton by entropy reduction over context trees
- Author
- Santos, Higor Í., Chaves, Daniel P.B., and Pimentel, Cecilio
- Published
- 2025
8. Optimizing hybrid models for canopy nitrogen mapping from Sentinel-2 in Google Earth Engine.
- Author
- De Clerck, Emma, D.Kovács, Dávid, Berger, Katja, Schlerf, Martin, and Verrelst, Jochem
- Subjects
- KRIGING; TIME series analysis; MULTISPECTRAL imaging; LEAST squares; K-nearest neighbor classification; PARTIAL least squares regression
- Abstract
Canopy nitrogen content (CNC) is a crucial variable for plant health, influencing photosynthesis and growth. An optimized, scalable approach for spatially explicit CNC quantification using Sentinel-2 (S2) data is presented, integrating PROSAIL-PRO simulations with Gaussian Process Regression (GPR) and an Active Learning technique, specifically the Euclidean distance-based diversity (EBD) approach for selective sampling. This hybrid method enhances training dataset efficiency and optimizes CNC models for practical applications. Two GPR models based on PROSAIL-PRO variables were evaluated: a protein-based model (Cprot-LAI) and a chlorophyll-based model (Cab-LAI). Both models, implemented in Google Earth Engine (GEE), demonstrated strong performance and outperformed other machine learning methods, including kernel ridge regression, principal component regression, neural network, weighted k-nearest neighbors regression, partial least squares regression and least squares linear regression. Validation results showed moderate to good accuracies: NRMSE = 16.76% and R² = 0.47 for Cprot-LAI, and NRMSE = 18.74% and R² = 0.51 for Cab-LAI. The models revealed high consistency for an independent validation dataset of the Munich-North-Isar (Germany) test site, with R² values of 0.58 and 0.71 and NRMSEs of 21.47% and 20.17% for the Cprot-LAI model and Cab-LAI model, respectively. The models also demonstrated high consistency across growing seasons, indicating their potential for time series analysis of CNC dynamics. Application of the S2-based mapping workflow across the Iberian Peninsula, with estimates showing relative uncertainty below 30%, highlights the model's broad applicability and portability. The optimized EBD-GPR-CNC approach within GEE supports scalable CNC estimation and offers a robust tool for monitoring nitrogen dynamics. • Sentinel-2 bands can estimate canopy nitrogen content over croplands. • RTM-based training data is optimized with Euclidean distance-based active learning. • CNC can be estimated with the protein-N and with the chlorophyll-N relation. • CNC is quantified and mapped over cropland using a GPR model with uncertainties. • Operational monitoring of CNC becomes possible with hybrid models implemented in GEE. [ABSTRACT FROM AUTHOR]
- Published
- 2024
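
The abstract above pairs Gaussian Process Regression with Euclidean distance-based diversity (EBD) sampling of simulated training data. As a minimal, generic sketch of that idea (not the authors' PROSAIL-PRO/GEE pipeline; the data, kernel, and greedy selection rule here are illustrative assumptions), using scikit-learn:

```python
# Hedged sketch: Euclidean-distance-based diversity (EBD) subset selection
# followed by Gaussian Process Regression, loosely mirroring the hybrid
# workflow described above (not the authors' PROSAIL-PRO/GEE code).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def ebd_select(X, n_keep):
    """Greedily keep samples that are far (Euclidean) from those already kept."""
    kept = [0]
    for _ in range(n_keep - 1):
        d = np.min(np.linalg.norm(X[:, None, :] - X[kept][None, :, :], axis=2), axis=1)
        d[kept] = -np.inf               # never re-select an already kept sample
        kept.append(int(np.argmax(d)))
    return np.array(kept)

rng = np.random.default_rng(0)
X_sim = rng.random((2000, 10))          # stand-in for simulated spectra (RTM outputs)
y_sim = X_sim @ rng.random(10)          # stand-in for simulated canopy nitrogen content

idx = ebd_select(X_sim, n_keep=200)     # reduced, diverse training set
gpr = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gpr.fit(X_sim[idx], y_sim[idx])
cnc_mean, cnc_std = gpr.predict(X_sim[:5], return_std=True)  # estimate plus uncertainty
```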
9. The decrease in alcohol consumption and suicide rate during the COVID-19 pandemic and their association.
- Author
- Kim, Agnus M. and Lee, Jin-Seok
- Subjects
- COVID-19 pandemic; SUICIDE statistics; ALCOHOL drinking; DEATH rate; TIME series analysis
- Abstract
Despite the considerable change in alcohol consumption during the COVID-19 pandemic, the impact of the pandemic on the suicide rate in terms of alcohol consumption was not studied. This study was performed to examine whether the change in the suicide rate during the COVID-19 pandemic was related to alcohol consumption and whether the relation was specific to suicides when compared to mortality due to other causes. We performed a comparative interrupted time series (CITS) analysis for the suicide rate of people aged 19 to 60 with three comparison groups (the suicide rate of people aged 19 and under, the cancer death rate of people aged 19 to 60, and alcohol-induced death rates). The suicide rate of people aged 19 to 60 and alcohol consumption per capita, along with alcohol-induced death rates, continued to decrease during the pandemic in 2020 and 2021, while the suicide rate of people aged 19 and under and the cancer death rate showed increases. In the comparative interrupted time series model, alcohol consumption had an increasing effect on the adult suicide rate compared to comparison groups when time trends and changes associated with COVID-19 were adjusted. This study shows that the decrease in the adult suicide rate in Korea during the pandemic was associated with the decrease in alcohol use among the adult population. Considering that means restriction is the most effective way of controlling suicide and that alcohol can be the most potent and final trigger for suicide, the decrease in suicides during the pandemic and its association with alcohol consumption should be understood as a call for further efforts to decrease alcohol consumption. • Despite the considerable change in alcohol use during the pandemic, its relationship with suicide rate was not studied. • The decrease in adult suicide rate in Korea during the pandemic was associated with the decrease in alcohol use. • Our findings suggest a need for further efforts to decrease alcohol consumption in order to reduce suicides. [ABSTRACT FROM AUTHOR]
- Published
- 2024
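
The study above uses a comparative interrupted time series (CITS) model. Below is a minimal, generic CITS regression sketch with statsmodels on synthetic monthly rates; the formula and variable names are illustrative, not the authors' specification:

```python
# Hedged sketch of a comparative interrupted time-series (CITS) regression,
# not the exact model specification used in the study above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
months = np.arange(96)                          # 8 years of monthly data (illustrative)
post = (months >= 72).astype(int)               # interruption (pandemic) starts at month 72

def simulate(level, trend, shift):
    return level + trend * months + shift * post + rng.normal(0, 1, months.size)

df = pd.concat([
    pd.DataFrame({"rate": simulate(30, -0.05, -2.0), "month": months, "post": post, "group": "adult_suicide"}),
    pd.DataFrame({"rate": simulate(25, 0.02, 0.5), "month": months, "post": post, "group": "cancer_death"}),
])

# Group-specific levels and slopes plus their changes at the interruption; the
# interaction terms capture how the treated series departs from the comparator.
model = smf.ols("rate ~ month * post * C(group)", data=df).fit()
print(model.params.filter(like="post"))
```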
10. On the observed time evolution of cosmic rays in a new time domain.
- Author
- Varotsos, C.A., Golitsyn, G.S., Mazei, Y., Sarlis, N.V., Xue, Y., Mavromichalaki, H., and Efstathiou, M.N.
- Subjects
- MAXIMUM entropy method; TIME series analysis; OPEN-ended questions; PHYSICS; DATA analysis
- Abstract
Since the 1990s, it has been recognized that the full explanation of cosmic rays (CR) and their spectrum may require some new physics. The debate on the origin of CR has led to the conclusion that while most CR come from supernova explosions in the Galaxy, CR with very high energies are likely of extragalactic origin. However, several open questions concerning CR above 10¹³ eV remain unanswered. We herewith study the temporal evolution of observed CR using data collected by several stations of the ground-based network. The obtained result states that the power spectral density of the CR temporal evolution, especially at frequencies below 0.1 Hz, exhibits the Kolmogorov-Obukhov 5/3 law that characterizes the energy spectrum of many geophysical quantities. Any small difference found from the 5/3 exponent can be attributed to intermittency corrections and the stations' characteristics. Moreover, natural time analysis applied to the CR time series showed the critical role of the quasi-biennial oscillation in the entropy maximization which occurs following the 5/3 Kolmogorov-Obukhov power law. These findings can be used to more reliably predict extreme CR events that could have an impact even at the molecular level. • The power spectral density of the observed time evolution of CRs obeys the 5/3 law. • The CRs entropy is maximized following the Kolmogorov-Obukhov 5/3-power law. • Small differences from the 5/3 law can be attributed to intermittency corrections. • The Natural Time Analysis of CRs data can lead to the prediction of its extremes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
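
The key quantity in the record above is the power spectral density and its closeness to the Kolmogorov-Obukhov 5/3 exponent. A minimal sketch of that check on a synthetic series (Welch periodogram plus a log-log slope fit; not the authors' data or their natural time analysis):

```python
# Hedged sketch: estimate the power spectral density of a synthetic series with
# Welch's method and fit the spectral slope on log-log axes, to be compared
# against the Kolmogorov-Obukhov 5/3 exponent discussed above.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(2)
fs = 1.0                                     # one sample per second (illustrative)
x = np.cumsum(rng.normal(size=100_000))      # toy red-noise series (true slope is about -2)

f, pxx = welch(x, fs=fs, nperseg=4096)
band = (f > 0) & (f < 0.1)                   # low-frequency band of interest
slope, intercept = np.polyfit(np.log10(f[band]), np.log10(pxx[band]), 1)
print(f"fitted spectral exponent: {slope:.2f} (the 5/3 law predicts -5/3)")
```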
11. A phenological-knowledge-independent method for automatic paddy rice mapping with time series of polarimetric SAR images.
- Author
- Lin, Suya, Qi, Zhixin, Li, Xia, Zhang, Hui, Lv, Qianwen, and Huang, Di
- Subjects
- SYNTHETIC aperture radar; TIME series analysis; RICE; LAND cover; AGRICULTURE; PADDY fields; PLANT phenology
- Abstract
Paddy rice, which sustains more than half of the global population, requires accurate and efficient mapping to ensure food security. Synthetic aperture radar (SAR) has become indispensable in this process due to its remarkable ability to operate effectively in adverse weather conditions and its sensitivity to paddy rice growth. Phenological-knowledge-based (PKB) methods have been commonly employed in conjunction with time series of SAR images for paddy rice mapping, primarily because they eliminate the need for training datasets. However, PKB methods possess inherent limitations, primarily stemming from their reliance on precise phenological information regarding paddy rice growth. This information varies across regions and paddy rice varieties, making it challenging to use PKB methods effectively on a large spatial scale, such as the national or global scale, where collecting comprehensive phenological data becomes impractical. Moreover, variations in farming practices and field conditions can lead to differences in paddy rice growth stages even within the same region. Using a generalized set of phenological knowledge in PKB methods may not be suitable for all paddy fields, potentially resulting in errors in paddy rice extraction. To address the challenges posed by PKB methods, this study proposed an innovative approach known as the phenological-knowledge-independent (PKI) method for mapping paddy rice using time series of Sentinel-1 SAR images. The central innovation of the PKI method lies in its capability to map paddy rice without relying on specific knowledge of paddy rice phenology or the need for a training dataset. This was made possible by the incorporation of three novel metrics: VH and VV normalized maximum temporal changes (NMTC) and VH temporal mean, derived from the distinctions between paddy rice and other land cover types in time series of SAR images. The PKI method was rigorously evaluated across three regions in China, each featuring different paddy rice varieties. Additionally, the PKI method was compared with two prevalent phenological-knowledge-based techniques: the automated paddy rice mapping method using SAR flooding signals (ARM-SARFS) and the manual interpretation of unsupervised clustering results (MI-UCR). The PKI method achieved an average overall accuracy of 97.99%, surpassing the ARM-SARFS, which recorded an accuracy of 89.65% due to errors stemming from phenological disparities among different paddy fields. Furthermore, the PKI method delivered results on par with the MI-UCR, which relied on the fusion of SAR and optical image time series, achieving an accuracy of 97.71%. As demonstrated by these findings, the PKI method proves highly effective in mapping paddy rice across diverse regions, all without the need for phenological knowledge or a training dataset. Consequently, it holds substantial promise for efficiently mapping paddy rice on a large spatial scale. The source code used in this study is available at https://code.earthengine.google.com/f82cf10cad64fa3f971ae99027001a6e. [ABSTRACT FROM AUTHOR]
- Published
- 2024
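
The PKI method above rests on simple temporal metrics of Sentinel-1 backscatter (VH and VV normalized maximum temporal change, and VH temporal mean). The sketch below computes plausible versions of those metrics on synthetic stacks; the NMTC formula is an assumption, since the abstract does not define it:

```python
# Hedged sketch of the kind of per-pixel temporal metrics the PKI method relies on.
# The exact "normalized maximum temporal change" (NMTC) definition is not given in
# the abstract, so the formula below is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(3)
vh = rng.uniform(-25, -10, size=(30, 500))   # 30 Sentinel-1 dates x 500 pixels (dB, synthetic)
vv = rng.uniform(-20, -5, size=(30, 500))

def nmtc(stack):
    """Assumed normalized maximum temporal change: (max - min) / (|max| + |min|)."""
    hi, lo = stack.max(axis=0), stack.min(axis=0)
    return (hi - lo) / (np.abs(hi) + np.abs(lo) + 1e-9)

features = np.stack([nmtc(vh), nmtc(vv), vh.mean(axis=0)], axis=1)
# A simple threshold or unsupervised clustering on these three metrics could then
# separate paddy pixels (large seasonal change, low mean VH) from other land cover.
```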
12. Using difference features effectively: A multi-task network for exploring change areas and change moments in time series remote sensing images.
- Author
- Li, Jialu and Wu, Chen
- Subjects
- OPTICAL flow; FEATURE extraction; REMOTE sensing; TIME series analysis; DEEP learning
- Abstract
With the rapid advancement in remote sensing Earth observation technology, an abundance of Time Series multispectral remote sensing Images (TSIs) from platforms like Landsat and Sentinel-2 are now accessible, offering essential data support for Time Series remote sensing images Change Detection (TSCD). However, TSCD faces misalignment challenges due to variations in radiation incidence angles, satellite orbit deviations, and other factors when capturing TSIs at the same geographic location but different times. Furthermore, another important issue that needs immediate attention is the precise determination of change moments for change areas within TSIs. To tackle these challenges, this paper proposes Multi-RLD-Net, a multi-task network that efficiently utilizes difference features to explore change areas and corresponding change moments in TSIs. To the best of our knowledge, this is the first time deep learning has been used to identify change moments in TSIs. Multi-RLD-Net integrates Optical Flow with Long Short-Term Memory (LSTM) to derive differences between TSIs. Initially, a lightweight encoder is introduced to extract multi-scale spatial features, which maximally preserve original features through a siamese structure. Subsequently, shallow spatial features extracted by the encoder are input into the novel Recursive Optical Flow Difference (ROD) module to align input features and detect differences between them, while deep spatial features extracted by the encoder are input into LSTM to capture long-term temporal dependencies and differences between hidden states. Both branches output differences among TSIs, enhancing the expressive capacity of the model. Finally, the decoder identifies change areas and their corresponding change moments using multi-task branches. Experiments on the UTRNet dataset and the DynamicEarthNet dataset demonstrate that the proposed RLD-Net and Multi-RLD-Net outperform representative approaches, achieving F1 value improvements of 1.29% and 10.42% compared to the state-of-the-art method MC2ABNet. The source code will be available soon at https://github.com/lijialu144/Multi-RLD-Net. [ABSTRACT FROM AUTHOR]
- Published
- 2024
13. Reconstructing NDVI time series in cloud-prone regions: A fusion-and-fit approach with deep learning residual constraint.
- Author
- Qin, Peng, Huang, Huabing, Chen, Peimin, Tang, Hailong, Wang, Jie, and Chen, Shuang
- Subjects
- NORMALIZED difference vegetation index; LANDSAT satellites; LAND cover; TIME series analysis; STATISTICAL correlation
- Abstract
The time series data of Normalized Difference Vegetation Index (NDVI) is crucial for monitoring changes in terrestrial vegetation. Existing reconstruction methods encounter challenges in areas prone to clouds, primarily due to inadequate utilization of spatial, temporal, periodic, and multi-sensor information, as well as a lack of physical interpretations. This frequently results in limited model performance or the omission of spatial details when predicting scenarios involving land cover changes. In this study, we propose a novel approach named Residual (Re) Constraints (Co) fusion-and-fit (ReCoff), consisting of two steps: ReCoF fusion (F) and Savitzky-Golay (SG) fit. This approach addresses the challenges of reconstructing 30 m Landsat NDVI time series data in cloudy regions. The fusion-fit process captures land cover changes and maps them from MODIS to Landsat using a deep learning model with residual constraints, while simultaneously integrating multi-dimensional, multi-sensor, and long time-series information. ReCoff offers three distinct advantages. First, the fusion results are more robust to land cover change scenarios and contain richer spatial details (RMSE of 0.091 vs. 0.101, 0.164, and 0.188 for ReCoF vs. STFGAN, FSDAF, and ESTARFM). Second, ReCoff improves the effectiveness of reconstructing dense time-series data (2016–2020, 16-day interval) in cloudy areas, whereas other methods are more susceptible to the impact of prolonged data gaps. ReCoff achieves a correlation coefficient of 0.84 with the MODIS reference series, outperforming SG (0.28), HANTS (0.32), and GF-SG (0.48). Third, with the help of the GEE platform, ReCoff can be applied over large areas (771 km × 634 km) and long-time scales (bimonthly intervals from 2000 to 2020) in cloudy regions. ReCoff demonstrates potential for accurately reconstructing time-series data in cloudy areas. [ABSTRACT FROM AUTHOR]
- Published
- 2024
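
The "fit" step of the workflow above is a Savitzky-Golay smoothing of the fused NDVI series. A minimal sketch of SG gap-filling and smoothing with SciPy on a synthetic, cloud-gapped NDVI series (the deep-learning ReCoF fusion step is not reproduced here):

```python
# Hedged sketch of generic Savitzky-Golay smoothing of a gappy NDVI time series,
# not the authors' ReCoff fusion-and-fit implementation.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(4)
t = np.arange(0, 365, 16)                               # 16-day compositing interval
ndvi = 0.5 + 0.3 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.02, t.size)
ndvi[rng.random(t.size) < 0.3] = np.nan                 # simulate cloud-contaminated gaps

valid = ~np.isnan(ndvi)
filled = np.interp(t, t[valid], ndvi[valid])            # simple linear gap filling
smooth = savgol_filter(filled, window_length=7, polyorder=3)
```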
14. Recursive classification of satellite imaging time-series: An application to land cover mapping.
- Author
- Calatrava, Helena, Duvvuri, Bhavya, Li, Haoqing, Borsoi, Ricardo, Beighley, Edward, Erdoğmuş, Deniz, Closas, Pau, and Imbiriba, Tales
- Subjects
- GAUSSIAN mixture models; CYANOBACTERIAL blooms; TIME series analysis; REMOTE-sensing images; IMAGE recognition (Computer vision); LAND cover
- Abstract
Despite the extensive body of literature focused on remote sensing applications for land cover mapping and the availability of high-resolution satellite imagery, methods for continuously updating classification maps in real-time remain limited, especially when training data is scarce. This paper introduces the recursive Bayesian classifier (RBC), which converts any instantaneous classifier into a robust online method through a probabilistic framework that is resilient to non-informative image variations. Three experiments are conducted using Sentinel-2 data: water mapping of the Oroville Dam in California and the Charles River basin in Massachusetts, and deforestation detection in the Amazon. RBC is applied to a Gaussian mixture model (GMM), logistic regression (LR), and our proposed spectral index classifier (SIC). Results show that RBC significantly enhances classifier robustness in multitemporal settings under challenging conditions, such as cloud cover and cyanobacterial blooms. Specifically, balanced classification accuracy improves by up to 26.95% for SIC, 12.4% for GMM, and 13.81% for LR in water mapping, and by 15.25%, 14.17%, and 14.7% in deforestation detection. Moreover, without additional training data, RBC improves the performance of the state-of-the-art DeepWaterMap and WatNet algorithms by up to 9.62% and 11.03%. These benefits are provided by RBC while requiring minimal supervision and maintaining a low computational cost that remains constant for each time step regardless of the time-series length. [Display omitted] [ABSTRACT FROM AUTHOR]
- Published
- 2024
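
The recursive Bayesian classifier described above turns any instantaneous classifier into an online one by propagating class posteriors through time. A minimal NumPy sketch of that recursion (the transition weight and the mock likelihoods are illustrative assumptions, not the paper's RBC formulation):

```python
# Hedged sketch of a recursive Bayesian update of per-pixel class probabilities:
# an instantaneous classifier supplies a likelihood at each date, and the posterior
# from the previous date acts as the prior.
import numpy as np

def recursive_update(posterior, likelihood, transition=0.95):
    """One time step: relax the old posterior toward uniform, then apply Bayes' rule."""
    n_classes = posterior.shape[-1]
    prior = transition * posterior + (1 - transition) / n_classes
    post = prior * likelihood
    return post / post.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(5)
n_pixels, n_classes, n_dates = 1000, 2, 12
posterior = np.full((n_pixels, n_classes), 1.0 / n_classes)     # uniform start
for _ in range(n_dates):
    # stand-in for GMM / logistic-regression / spectral-index class likelihoods
    likelihood = rng.dirichlet(np.ones(n_classes), size=n_pixels)
    posterior = recursive_update(posterior, likelihood)
label = posterior.argmax(axis=1)                                # e.g. water / non-water map
```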
15. An automatic procedure for mapping burned areas globally using Sentinel-2 and VIIRS/MODIS active fires in Google Earth Engine.
- Author
- Bastarrika, Aitor, Rodriguez-Montellano, Armando, Roteta, Ekhi, Hantson, Stijn, Franquesa, Magí, Torre, Leyre, Gonzalez-Ibarzabal, Jon, Artano, Karmele, Martinez-Blanco, Pilar, Mesanza, Amaia, Anaya, Jesús A., and Chuvieco, Emilio
- Subjects
- FOREST fires; TROPICAL forests; TEMPERATE forests; INFRARED imaging; TIME series analysis
- Abstract
• An automatic burned area mapping algorithm based on active fire data and Sentinel-2 Level 2A imagery is presented. • Good concordance of the proposed algorithm with reference sources. • Global commission errors are higher than omission errors. • Forest fires in tropical and temperate forest are the least accurately mapped ecosystems. • Greater accuracy of the proposed algorithm in comparison to FIRECCI51 and MCD64A1. Understanding the spatial and temporal trends of burned areas (BA) on a global scale offers a comprehensive view of the underlying mechanisms driving fire incidence and its influence on ecosystems and vegetation recovery patterns over extended periods. Such insights are invaluable for modeling fire emissions and the formulation of strategies for post-fire rehabilitation planning. Previous research has provided strong evidence that current global BA products derived from coarse spatial resolution data underestimates global burned areas. Consequently, there is a pressing need for global high-resolution BA products. Here, we present an automatic global burned area mapping algorithm (Sentinel2BAM) based on Sentinel-2 Level-2A imagery combined with Visible Infrared Imaging Radiometer Suite (VIIRS) and Moderate Resolution Imaging Spectrometer (MODIS) active fire data. The algorithm employs a Random Forest Model trained by active fires to predict BA probabilities in each 5-day Normalized Burn Ratio (NBR) index-based temporal composites. In a second step, a time-series and object-based analysis of the estimated BA probabilities allows burned areas to be detected on a quarterly basis. The algorithm was implemented in Google Earth Engine (GEE) and applied to 576 Sentinel-2 tiles corresponding to 2019, distributed globally, to assess its ability to map burned areas across different ecosystems. Two validation sources were employed: 21 EMSR Copernicus Emergency Service perimeters obtained using high spatial resolution (<10 m) data (EMSR21) located in the Mediterranean basin and 50 20x20 km global samples selected by stratified sampling with Sentinel-2 at 10 m spatial resolution (GlobalS50). Additionally, 105 Landsat-based long sample units (GlobalL105), were employed to compare the performance of the Sentinel2BAM algorithm against the FIRECCI51 and MCD64A1 global products. Overall accuracy metrics for the Sentinel2BAM algorithm, derived from validation sources highlight higher commission (CE) than omission (OE) errors (CE=10.3 % and OE=7.6 % when using EMSR21 as reference, CE=18.9 % and OE=9.5 % when using Global S50 as reference), while GlobalL105-based inferenced global comparison metrics show similar patterns (CE=22.5 % and OE=13.4 %). Results indicate differences across ecosystems: forest fires in tropical and temperate biomes exhibit higher CE, mainly due to confusion between burned areas and croplands. According to GlobalL105, Sentinel2BAM shows greater accuracy globally (CE=22.5 %, OE=13.4 %) compared to FIRECCI51 (CE=20.8 %, OE=46.5 %) and MCD64A1 (CE=17.5 %, OE=53.1 %), substantially improving the detection of small fires and thereby reducing omission errors. The strengths and weaknesses of the algorithm are thoroughly addressed, demonstrating its potential for global application. [ABSTRACT FROM AUTHOR]
- Published
- 2024
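
The algorithm above trains a random forest on active-fire labels over 5-day NBR composites. A minimal sketch of the NBR computation and the burn-probability step with scikit-learn on synthetic reflectances (band choices and labels are illustrative; the GEE time-series and object-based analysis is omitted):

```python
# Hedged sketch: compute the Normalized Burn Ratio (NBR) from NIR and SWIR
# reflectances and let a random forest turn temporal NBR composites into burn
# probabilities. Training labels from active fires are mocked here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def nbr(nir, swir):
    return (nir - swir) / (nir + swir + 1e-9)

rng = np.random.default_rng(6)
n_pixels, n_composites = 5000, 8                      # eight 5-day composites (illustrative)
nir = rng.uniform(0.1, 0.5, (n_pixels, n_composites))
swir = rng.uniform(0.05, 0.4, (n_pixels, n_composites))
X = nbr(nir, swir)                                    # one NBR value per composite
y = rng.integers(0, 2, n_pixels)                      # stand-in for active-fire labels

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
burn_probability = rf.predict_proba(X)[:, 1]          # input to a later time-series/object step
```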
16. Sequential polarimetric phase optimization algorithm for dynamic deformation monitoring of landslides.
- Author
- Wang, Yian, Luo, Jiayin, Dong, Jie, Mallorqui, Jordi J., Liao, Mingsheng, Zhang, Lu, and Gong, Jianya
- Subjects
- TIME series analysis; OPTIMIZATION algorithms; MAXIMUM likelihood statistics; STATISTICAL sampling; SUPPLY & demand; LANDSLIDES
- Abstract
In the era of big SAR data, it is urgent to develop dynamic time series DInSAR processing procedures for near-real-time monitoring of landslides. However, the dense vegetation coverage in mountainous areas causes severe decorrelation, which demands high precision and efficiency in phase optimization processing. Common phase optimization using single-polarization SAR data cannot produce satisfactory results due to the limited statistical samples in some natural scenarios. The novel polarimetric phase optimization algorithms, however, have low computational efficiency, limiting their applications in large-scale scenarios and long data sequences. In addition, temporal changes in the scattering properties of ground features and the continuous increase of SAR data require dynamic phase optimization processing. To achieve efficient phase optimization for dynamic DInSAR time series analysis, we combine the Sequential Estimator (SE) with the Total Power (TP) polarization stacking method and solve it using the eigendecomposition-based Maximum Likelihood Estimator (EMI), named SETP-EMI. The simulation and real data experiments demonstrate the significant improvements of the SETP-EMI method in precision and efficiency compared to the EMI and TP-EMI methods. The SETP-EMI exhibits an increase of more than 50% and 20% in highly coherent points for the real data compared to the EMI and TP-EMI, respectively. Meanwhile, it is approximately six and two times more efficient than the EMI and TP-EMI methods, respectively, in the real data case. These results highlight the effectiveness of the SETP-EMI method in promptly capturing and analyzing evolving landslide deformations, providing valuable insights for real-time monitoring and decision-making. [ABSTRACT FROM AUTHOR]
- Published
- 2024
17. Evaluating the performance of herd-specific long short-term memory models to identify automated health alerts associated with a ketosis diagnosis in early-lactation cows.
- Author
- Taechachokevivat, N., Kou, B., Zhang, T., Montes, M.E., Boerman, J.P., Doucette, J.S., and Neves, R.C.
- Subjects
- MACHINE learning; MILK yield; ARTIFICIAL intelligence; DAIRY farms; TIME series analysis; MILK quality
- Abstract
The list of standard abbreviations for JDS is available at adsa.org/jds-abbreviations-24. Nonstandard abbreviations are available in the Notes. The growing use of automated systems in the dairy industry generates a vast amount of cow-level data daily, creating opportunities for using these data to support real-time decision-making. Currently, various commercial systems offer built-in alert algorithms to identify cows requiring attention. To our knowledge, no work has been done to compare the use of models accounting for herd-level variability on their predictive ability against automated systems. Long short-term memory (LSTM) models are machine learning models capable of learning temporal patterns and making predictions based on time series data. The objective of our study was to evaluate the ability of LSTM models to identify a health alert associated with a ketosis diagnosis (HAK) using deviations of daily milk yield, milk fat-to-protein ratio (FPR), number of successful milkings, rumination time, and activity index from the herd median by parity and DIM, considering various time series lengths and numbers of days before HAK. Additionally, we aimed to use Explainable Artificial Intelligence method to understand the relationships between input variables and model outputs. Data on daily milk yield, milk FPR, number of successful milkings, rumination time, activity, and health events during 0 to 21 DIM were retrospectively obtained from a commercial Holstein dairy farm in northern Indiana from February 2020 to January 2023. A total of 1,743 cows were included in the analysis (non-HAK = 1,550; HAK = 193). Variables were transformed based on deviations from the herd median by parity and DIM. Six LSTM models were developed to identify HAK 1, 2, and 3 d before farm diagnosis using historic cow-level data with varying time series lengths. Model performance was assessed using repeated stratified 10-fold cross-validation for 20 repeats. The Shapley additive explanations framework (SHAP) was used for model explanation. Model accuracy was 83%, 74%, and 70%; balanced error rate was 17% to 18%, 26% to 28%, and 34%; sensitivity was 81% to 83%, 71% to 74%, and 62%; specificity was 83%, 74%, and 71%; positive predictive value was 38%, 25% to 27%, and 21%; negative predictive value was 97% to 98%, 95% to 96%, and 94%; and area under the curve was 0.89 to 0.90, 0.80 to 0.81, and 0.72 for models identifying HAK 1, 2, and 3 d before diagnosis, respectively. Performance declined as the time interval between identification and farm diagnosis increased, and extending the time series length did not improve model performance. Model explanation revealed that cows with lower milk yield, number of successful milkings, rumination time, and activity, and higher milk FPR compared with herdmates of the same parity and DIM were more likely to be classified as HAK. Our results demonstrate the potential of LSTM models in identifying HAK using deviations of daily milk production variables, rumination time, and activity index from the herd median by parity and DIM. Future studies are needed to evaluate the performance of health alerts using LSTM models controlling for herd-specific metrics against commercial built-in algorithms in multiple farms and for other disorders. [ABSTRACT FROM AUTHOR]
- Published
- 2024
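
The study above classifies daily deviation features with long short-term memory (LSTM) networks. A minimal PyTorch sketch of such a classifier on synthetic cow-level deviation sequences (architecture, shapes, and hyperparameters are illustrative assumptions, not the herd-specific models evaluated in the paper):

```python
# Hedged sketch of a small LSTM binary classifier over daily deviation features
# (milk yield, fat-to-protein ratio, milkings, rumination, activity).
import torch
from torch import nn

class AlertLSTM(nn.Module):
    def __init__(self, n_features=5, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                      # x: (batch, days, features)
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1]).squeeze(-1)  # one logit per cow

model = AlertLSTM()
x = torch.randn(64, 14, 5)                     # 64 cows, 14 days of deviations from the herd median
y = torch.randint(0, 2, (64,)).float()         # 1 = health alert associated with ketosis
loss = nn.BCEWithLogitsLoss()(model(x), y)
loss.backward()                                # one illustrative training step (optimizer omitted)
```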
18. Residual loneliness in the Netherlands after the COVID-19 pandemic: An application of the single interrupted time series design with pre-, peri- and post-pandemic observations.
- Author
- van Tilburg, Theo G.
- Subjects
- PSYCHOLOGICAL resilience; LONELINESS; TIME series analysis; DISEASE prevalence; COVID-19 pandemic
- Abstract
During the COVID-19 pandemic, many countries implemented policies to physically separate citizens. As a consequence, an increased prevalence of loneliness was observed. This article examined whether the prevalence of loneliness in the Netherlands has returned to pre-pandemic levels after the restrictive policy was ended. We studied age differences in the course of loneliness. Single interrupted time series design. Data were from the Longitudinal Internet Studies for the Social Sciences (age range 16–102 years) and the Longitudinal Aging Study Amsterdam (age range 65–101 years). Both studies included respondents sampled from the Dutch population registers. Data collected relatively soon and later after the pandemic outbreak (nine and five observations in 2020–2023, respectively) were compared to extrapolated trend data based on a long period of time before the outbreak (since 2008 and 1992, respectively). With two exceptions, the results of the two studies including five age categories and three types of loneliness measurement instruments showed that after an increased prevalence during the pandemic, prevalence at the last observation was at or below the level of the extrapolated trend. It is highly likely that the pandemic was indeed an interruption and not a fundamental trend change in loneliness. This shows individuals' resilience and the ability to reactivate social ties after the interruptive pandemic. [ABSTRACT FROM AUTHOR]
- Published
- 2024
19. The cost of obesity and related NCDs in Brazil: An analysis of hospital admissions, disability retirement benefits, and statutory sick pay.
- Author
- Abbade, Eduardo Botti
- Subjects
- RETIREMENT & economics; DISABILITY insurance; SOCIAL security; PEARSON correlation (Statistics); SICK leave; BODY mass index; PATIENTS; HOSPITAL care; HOSPITAL admission & discharge; TIME series analysis; DESCRIPTIVE statistics; NON-communicable diseases; PUBLIC health; OBESITY; ECONOMIC aspects of diseases; REGRESSION analysis; NOSOLOGY; MEDICAL care costs; ECONOMICS
- Abstract
This study analyses the prevalence of overweight/obesity in Brazil, and its costs regarding hospital admissions (HA), disability retirement benefits (DRB), and statutory sick pay (SSP) associated with obesity-related non-communicable diseases (NCDs). Time-series study. This study analyses data from the VIGITEL system (2010–2019) to calculate the body-mass index (BMI) of adult residents in Brazil's state capitals. Data on HA, DRB, and SSP were obtained from Brazil's SIH/SUS and AEPS Infologo systems. Pearson's correlation and linear regression models were applied. The study selected 23 diseases of the International Classification of Disease (ICDs) belonging to chapters C; E; I; and K. Cost values in BRL were deflated using IPCA. The results showed a significant increase in overweight and obesity rates in Brazil, with BMI rising by 0.09 kg/m2 annually. Regression analysis revealed that each 1-point increase in the average BMI of the population is associated with an increase of 81,772 (BRL 237.51 million/year) new HA per year, 5541 (BRL 18.8 million/year) new DRB granted per year, and 42,360 (BRL 131 million/year) new SSP per year. Also, every 1 % increase in the share of the Brazilian population with obesity is associated with an increase of 16,973 (BRL 48.8 million/year) new HA per year, 1202 (BRL 3.97 million/year) new DRB granted per year, and 8686 (BRL 26.8 million/year) new SSP per year. Regressions for deflated values showed lower significance, suggesting a strong impact of inflation on health costs in Brazil. Obesity prevalence in Brazil implies high direct and indirect costs for the Brazilian government, especially considering circulatory system diseases. • Obesity prevalence in Brazil implies high direct and indirect costs for public health. • A 1-point increase in population BMI leads to a USD 62 million rise in new hospital admissions for the public health system. • A 1% rise in Brazil's obesity rate adds USD 12.3 million in new hospital admissions through the public health system. • For every 1-point population BMI increase may cost the Brazilian government BRL 387.3 million more annually across 23 ICDs. • Obesity in Brazil drives rising direct and indirect costs, especially for circulatory diseases. [ABSTRACT FROM AUTHOR]
- Published
- 2024
20. Treatment of epistemic uncertainty in conjunction analysis with Dempster-Shafer theory.
- Author
- Sánchez, Luis, Vasile, Massimiliano, Sanvido, Silvia, Merz, Klaus, and Taillan, Christophe
- Subjects
- DEMPSTER-Shafer theory; EPISTEMIC uncertainty; TIME series analysis; STATISTICS; DATABASES
- Abstract
• New model of epistemic uncertainty in Conjunction Data Messages. • Combination of Dvoretzky–Kiefer–Wolfowitz inequality and Dempster-Shafer theory. • New robust classification system for conjunction events. • Validation of the robust classification system against real conjunction scenarios. • Statistical analysis of high-risk and uncertain events detection in a real database. The paper presents an approach to the modelling of epistemic uncertainty in Conjunction Data Messages (CDM) and the classification of conjunction events according to the confidence in the probability of collision. The approach proposed in this paper is based on Dempster-Shafer Theory (DSt) of evidence and starts from the assumption that the observed CDMs are drawn from a family of unknown distributions. The Dvoretzky–Kiefer–Wolfowitz (DKW) inequality is used to construct robust bounds on such a family of unknown distributions starting from a time series of CDMs. A DSt structure is then derived from the probability boxes constructed with DKW inequality. The DSt structure encapsulates the uncertainty in the CDMs at every point along the time series and allows the computation of the belief and plausibility in the realisation of a given probability of collision. The methodology proposed in this paper is tested on a number of real events and compared against existing practices in the European and French Space Agencies. We will show that the classification system proposed in this paper is more conservative than the approach taken by the European Space Agency but provides an added quantification of uncertainty in the probability of collision. [ABSTRACT FROM AUTHOR]
- Published
- 2024
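
The first step in the approach above is a Dvoretzky-Kiefer-Wolfowitz (DKW) confidence band, i.e. a probability box, around the empirical distribution obtained from a series of CDMs. A minimal NumPy sketch of that band (the Dempster-Shafer structure and the belief/plausibility computation are omitted; the miss-distance data are synthetic):

```python
# Hedged sketch of the DKW step: a confidence band around the empirical CDF of
# values taken from a series of CDMs, on which a p-box can be defined.
import numpy as np

def dkw_band(samples, alpha=0.05):
    """Return x, lower CDF bound, and upper CDF bound at confidence 1 - alpha."""
    x = np.sort(samples)
    ecdf = np.arange(1, x.size + 1) / x.size
    eps = np.sqrt(np.log(2.0 / alpha) / (2.0 * x.size))   # DKW inequality
    return x, np.clip(ecdf - eps, 0.0, 1.0), np.clip(ecdf + eps, 0.0, 1.0)

rng = np.random.default_rng(7)
miss_distance = rng.lognormal(mean=6.0, sigma=0.4, size=40)   # one value per CDM (illustrative)
x, lo, hi = dkw_band(miss_distance)
# Any CDF lying between `lo` and `hi` is consistent with the observed CDMs at the
# chosen confidence level, which is what belief and plausibility are built on.
```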
21. Occurrence of sudden storm commencement in interplanetary space.
- Author
- Singh, Y.P., Badruddin, B., and Agarwal, S.
- Subjects
- SOLAR activity; COSMIC rays; WAVELETS (Mathematics); TIME series analysis; ROTATION of the Sun
- Abstract
Sudden storm commencement (SSC) events detected in interplanetary space from 1869 to 2023, are subjected to wavelet and Lomb-Scargle periodogram analyses to determine the potential frequency of their recurrence. This period includes SSC events from the preceding fourteen solar cycles. Using Lomb-Scargle periodogram analysis, we have identified SSC occurrences independently during the odd and even solar activity cycles. We also used the wavelet analysis approach throughout the positive (A > 0) and negative (A < 0) polarity states of the heliospheric magnetic fields, in addition to solar activity cycles. This study reveals some notable short-term periodic variations in the SSC time series. The findings indicate that SSC occurrences have a conspicuous and important ∼ 44.0-day period. We have also observed solar and extended solar rotation periods in the time series, along with some intermittent variations (e.g., ∼22.0-, ∼19.0-days). The observed periodicities in the SSC time series are discussed and compared with the previously detected periodicities of solar wind and plasma, geomagnetic activity, and cosmic rays' intensity indicators. [ABSTRACT FROM AUTHOR]
- Published
- 2024
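
The record above relies on Lomb-Scargle periodograms of unevenly spaced event series. A minimal SciPy sketch on synthetic data with a planted ~44-day period (the scan range and sampling are illustrative, not the 1869-2023 SSC catalogue):

```python
# Hedged sketch: a Lomb-Scargle periodogram of an unevenly sampled series,
# scanning periods around the ~44-day and ~27-day (solar rotation) bands
# mentioned above. SciPy expects angular frequencies.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(8)
t = np.sort(rng.uniform(0, 2000, 600))                   # irregular observation days
y = np.sin(2 * np.pi * t / 44.0) + 0.5 * rng.normal(size=t.size)
y -= y.mean()                                            # remove the mean before the periodogram

periods = np.linspace(10, 100, 2000)                     # days
omega = 2 * np.pi / periods                              # angular frequency (rad/day)
power = lombscargle(t, y, omega)
print(f"strongest period: {periods[np.argmax(power)]:.1f} days")
```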
22. Abnormal behavior detection in industrial control systems based on CNN.
- Author
- Chen, Jingzhao, Liu, Bin, and Zuo, Haowen
- Subjects
- INDUSTRIAL controls manufacturing; FEATURE extraction; INTERNET of things; TIME series analysis; GENERALIZATION
- Abstract
With the widespread application of Internet of Things technology in industrial control systems, abnormal behavior detection has become a key task to ensure the safety and stable operation of the system. We propose a multi-branch convolutional fusion neural network method to improve the accuracy and efficiency of abnormal behavior detection in Internet of Things industrial control systems. This method achieves efficient detection of abnormal behavior in video data by combining ResNet152 for spatial feature extraction and GRU for temporal feature extraction. Unlike traditional methods, this method adopts a multi-branch structure, which can simultaneously capture multi-scale feature information, significantly enhancing the richness of feature expression and the accuracy of detection. Experimental results show that on the UCF-Crime dataset, the accuracy of this method reaches 85.76%, which is significantly better than that of traditional methods. In addition, on the larger UCF-101 dataset, the accuracy of this method reaches 92.21%, further verifying its excellent generalization performance. Compared with the C3D network, this method improves the accuracy by nearly 6% while maintaining a high processing speed. These results show that the proposed method has great potential in practical applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
23. Monitoring financial healthcare protection in Brazil: evolution, inequalities, and associated factors.
- Author
- Torres, T.F., Santos, P.H.A., Russo, L.X., and Silva, E.N.
- Subjects
- BIOLOGICAL evolution; MEDICAL care use; HEALTH services accessibility; HEALTH insurance; MEDICAL care; FOOD security; HEALTH policy; TIME series analysis; DESCRIPTIVE statistics; CATASTROPHIC illness; DISEASE prevalence; SURVEYS; RURAL conditions; HEALTH equity; MEDICAL care costs; SOCIAL classes
- Abstract
Although catastrophic health spending is the main measure for assessing financial healthcare protection, it varies considerably in methodological and empirical terms, which hinders comparison between studies. The aim of this study was to measure the prevalence of catastrophic health spending in Brazil in 2003, 2009, and 2018, its associated factors, and disparities in prevalence distribution according to socioeconomic status. This was a time series study. Data from the Household Budget Surveys were used. Prevalence of catastrophic health spending was measured as a percentage of the budget and ability to pay, considering thresholds of 10, 25, and 40%. It was determined whether household, family, and household head characteristics influence the likelihood of incurring catastrophic health spending. Households were stratified by income deciles, consumption, and wealth score. There was an increase in prevalence of catastrophic health spending between 2003 and 2009 in Brazil and a slight reduction in 2018. The wealth score showed more pronounced distributional effects between the poor and the rich, with the former being the most affected by catastrophic health spending. Consumption showed greater percentage variations in the prevalence of catastrophic health spending. The prevalence of catastrophic health spending was positively associated with the presence of older adults, age and female household head, rural area, receipt of government benefits, and some degree of food insecurity. The poorest families are most affected by catastrophic health spending in Brazil, requiring more effective and equitable policies to mitigate financial risk. [ABSTRACT FROM AUTHOR]
- Published
- 2024
24. Effect of indoor residual spraying on sandfly abundance and incidence of visceral leishmaniasis in India, 2016–22: an interrupted time-series analysis and modelling study.
- Author
- Coffeng, Luc E, de Vlas, Sake J, Singh, Rudra Pratap, James, Ananthu, Bindroo, Joy, Sharma, Niteen K, Ali, Asgar, Singh, Chandramani, Sharma, Sadhana, and Coleman, Michael
- Subjects
- VISCERAL leishmaniasis; MANAGEMENT information systems; TIME series analysis; VECTOR data; INFORMATION resources management
- Abstract
Efforts to eliminate visceral leishmaniasis in India mainly consist of early detection and treatment of cases and indoor residual spraying with insecticides to kill the phlebotomine sandfly Phlebotomus argentipes that transmits the causative Leishmania protozoa. In this modelling study, we aimed to estimate the effect of indoor residual spraying (IRS) on vector abundance and transmission of visceral leishmaniasis in India. In this time-series analysis and modelling study, we assessed the effect of IRS on vector abundance by using indoor vector-abundance data (from 2016 to 2022) and IRS quality-assurance data (from 2017–20) from 50 villages in eight endemic blocks in India where IRS was implemented programmatically. To assess a potential dose–response relation between insecticide concentrations and changes in sandfly abundance, we examined the correlation between site-level insecticide concentrations and the site-level data for monthly sandfly abundances. We used mathematical modelling to link vector data to visceral leishmaniasis case numbers from the national Kala-Azar Management Information System registry (2013–21), and to predict the effect of IRS on numbers of averted cases and deaths. IRS was estimated to reduce indoor sandfly abundance by 27% (95% CI 20–34). Concentrations of insecticides on walls were significantly—but weakly—associated with the degree of reduction in vector abundance, with a reduction of –0·0023 (95% CI –0·0040 to –0·0007) sandflies per mg/m2 insecticide (p=0·0057). Reported case numbers of visceral leishmaniasis were well explained by trends in vector abundance. Village-wide IRS in response to a newly detected case of visceral leishmaniasis was predicted to reduce disease incidence by 6–40% depending on the presumed reduction in vector abundance modelled. Indoor residual spraying has substantially reduced sandfly abundance in India, which has contributed to reductions in visceral leishmaniasis and related deaths. To prevent the re-emergence of visceral leishmaniasis as a public health problem, surveillance of transmission and sandfly abundance is warranted. Bill & Melinda Gates Foundation. For the Hindi translation of the abstract see Supplementary Materials section. [ABSTRACT FROM AUTHOR]
- Published
- 2024
25. Time Series Analysis: Associations Between Temperature and Primary Care Utilization in Philadelphia, Pennsylvania.
- Author
- Fitzpatrick, Janet H., Willard, Adrienne, Edwards, Janelle R., Harhay, Meera N., Schinasi, Leah H., Matthews, Janet, and May, Nathalie
- Subjects
- EARTH temperature; EXTREME weather; TIME series analysis; MEDICAL screening; PRIMARY care
- Abstract
Earth's temperature has risen by an average of 0.11°F per decade since 1850 and experts predict continued global warming. Studies have shown that exposure to extreme temperatures is associated with adverse health outcomes. Missed primary care visits can lead to incomplete preventive health screenings and unmanaged chronic diseases. This study examines the associations between extreme temperature conditions and primary care utilization among adult Philadelphians. A total of 1,048,575 appointments from 91,580 patients age ≥ 18 years enrolled in the study at thirteen university-based outpatient clinics in Philadelphia from January 1, 2009 to December 31, 2019. Statistical analysis was performed from June to December 2023. Data on attended and missed appointments was linked with measures of daily maximum temperature and precipitation, stratified by warm and cold seasons. Sociodemographic variables and associations with chronic disease status were explored. Rates of missed appointments increased by 0.72% for every 1°F decrease in daily maximum temperatures below 39°F and increased by 0.64% for every 1°F increase above 89°F. Individuals ≥ 65 years and those with chronic conditions had stronger associations with an increased rate of missed appointments. Temperature extremes were associated with higher rates of missed primary care appointments. Individuals with chronic diseases were more likely to have missed appointments associated with extreme temperatures. The findings suggest the need for primary care physicians to explore different modes of care delivery to support vulnerable populations, such as making telemedicine during extreme weather events a viable and affordable option. [ABSTRACT FROM AUTHOR]
- Published
- 2024
26. Seasonally inundated area extraction based on long time-series surface water dynamics for improved flood mapping.
- Author
- Zhao, Bingyu, Wu, Jianjun, Chen, Meng, Lin, Jingyu, and Du, Ruohua
- Subjects
- EMERGENCY management; HARMONIC maps; SURFACE dynamics; HARMONIC analysis (Mathematics); TIME series analysis; NATURAL disasters
- Abstract
Accurate extraction of Seasonally Inundated Area (SIA) is pivotal for precise delineation of Flood Inundation Area (FIA). Current methods predominantly rely on Water Inundation Frequency (WIF) to extract SIA, which, due to the lack of analysis of dynamic surface water changes, often yields less accurate and robust results. This significantly hampers the rapid and precise mapping of FIA. In the study, based on the Harmonic Models constructed from Long Time-series Surface Water (LTSW) dynamics, an SIA extraction approach (SHM) was introduced to enhance their accuracy and robustness, thereby improving flood mapping. The experiments were conducted in Poyang Lake, a region characterized by active hydrological phenomena. Sentinel-1/2 remote sensing data were utilized to extract LTSW. Harmonic analysis was applied to the LTSW dataset, using the amplitude terms in the harmonic model to characterise the frequency of variation between land and water for the surface units, thus extracting the SIAs. The results reveal that the harmonic model parameters are capable of portraying SIA. In comparison to the commonly used WIF thresholding method for SIA extraction, the SHM approach demonstrates superior accuracy and robustness. Leveraging the SIA extracted through SHM, a higher level of accuracy in FIA extraction is achieved. Overall, the SHM offers notable advantages, including high accuracy, automation, and robustness. It offers reliable reference water extents for flood mapping, especially in areas with active and complex hydrological dynamics. SHM can play a crucial role in emergency response to flood disasters, providing essential technical support for natural disaster management and related departments. [ABSTRACT FROM AUTHOR]
- Published
- 2024
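
The SHM approach above fits harmonic models to long surface-water time series and uses the amplitude terms to flag seasonal inundation. A minimal least-squares sketch of such a harmonic fit for one pixel (the two-harmonic design matrix and the amplitude threshold are illustrative assumptions, not the authors' implementation):

```python
# Hedged sketch of fitting an annual harmonic model to a per-pixel water/land
# series by least squares and using the annual amplitude to flag seasonally
# inundated pixels.
import numpy as np

rng = np.random.default_rng(9)
t = np.arange(0, 3 * 365, 12)                               # ~12-day observations over 3 years
water = (np.sin(2 * np.pi * t / 365) > 0.2).astype(float)   # toy seasonal water mask (0/1)
water += rng.normal(0, 0.05, t.size)

# Design matrix: intercept plus the first two annual harmonics
w = 2 * np.pi * t / 365.0
A = np.column_stack([np.ones_like(w), np.cos(w), np.sin(w), np.cos(2 * w), np.sin(2 * w)])
coef, *_ = np.linalg.lstsq(A, water, rcond=None)
annual_amplitude = np.hypot(coef[1], coef[2])
seasonally_inundated = annual_amplitude > 0.3               # assumed threshold, per pixel
```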
27. Utilizing correlation in space and time: Anomaly detection for Industrial Internet of Things (IIoT) via spatiotemporal gated graph attention network.
- Author
- Fan, Yuxin, Fu, Tingting, Listopad, Nikolai Izmailovich, Liu, Peng, Garg, Sahil, and Hassan, Mohammad Mehedi
- Subjects
- CONVOLUTIONAL neural networks; GRAPH neural networks; ANOMALY detection (Computer security); INTERNET of things; TIME series analysis
- Abstract
The Industrial Internet of Things (IIoT) infrastructure is inherently complex, often involving a multitude of sensors and devices. Ensuring the secure operation and maintenance of these systems is increasingly critical, making anomaly detection a vital tool for guaranteeing the success of IIoT deployments. In light of the distinctive features of the IIoT, graph-based anomaly detection emerges as a method with great potential. However, traditional graph neural networks, such as Graph Convolutional Networks (GCNs) and Graph Attention Networks (GATs), have certain limitations and significant room for improvement. Moreover, previous anomaly detection methods based on graph neural networks have focused only on capturing dependencies in the spatial dimension, lacking the ability to capture dynamics in the temporal dimension. To address these shortcomings, we propose an anomaly detection method based on Spatio-Temporal Gated Attention Networks (STGaAN). STGaAN learns a graph structure representing the dependencies among sensors and then utilizes gated graph attention networks and temporal convolutional networks to grasp the spatio-temporal connections in time series data of sensors. Furthermore, STGaAN optimizes the results jointly based on both reconstruction and prediction loss functions. Experiments on public datasets indicate that STGaAN performs better than other advanced baselines. We also visualize the learned graph structures to provide insights into the effectiveness of graph-level anomaly detection. [ABSTRACT FROM AUTHOR]
- Published
- 2024
28. The g-C3N4/rGO composite for high-performance supercapacitor: Synthesis, characterizations, and time series modeling and predictions.
- Author
- Sonkawade, Aniket R., Mahajan, Sumedh S., Shelake, Anjali R., Ahir, Shubham A., Waikar, Maqsood R., Sutar, Santosh S., Sonkawade, Rajendra G., and Dongale, Tukaram D.
- Subjects
- CARBON-based materials; ENERGY density; TIME series analysis; ENERGY storage; FOURIER transform infrared spectroscopy; SUPERCAPACITOR electrodes
- Abstract
Supercapacitors have gathered significant interest in addressing the increasing need for energy storage devices with high power and energy densities. This study introduces a composite framework designed to enhance supercapacitor performance by leveraging the synergistic effects of the g-C3N4/rGO composite. The composite electrode, which combines rGO for high conductivity and g-C3N4 for rapid ion diffusion, demonstrates excellent supercapacitor performance. The synthesized materials were characterized using different analytical tools such as X-ray diffraction, Fourier transform infrared spectroscopy, Brunauer-Emmett-Teller analysis, scanning and transmission electron microscopy, and X-ray photoelectron spectroscopy. The composite electrode exhibited outstanding performance characteristics, including a specific capacitance of 407.7 F/g, an energy density of 11.4 W h/kg, and a high power density of 186.7 W/kg. Kinetic analysis revealed that the g-C3N4/rGO composite electrode has a dominant diffusive controlled process. The g-C3N4/rGO electrode also demonstrated exceptional durability, maintaining remarkable capacitance retention even after the 5000th charge/discharge cycle. Moreover, the cyclic stability and Coulombic efficiency of the g-C3N4/rGO electrode were modeled and predicted by utilizing a time series analysis technique (Holt-Winters exponential smoothing). Considering the overall results, the g-C3N4/rGO composite electrode has great potential for energy storage applications. [Display omitted] • g-C3N4/rGO composite shows good specific capacitance (407.7 F/g). • Demonstrates good energy density (11.4 W h/kg) and power density (186.7 W/kg). • Composite shows good stability (83%) and retains 89% Coulombic efficiency. • The g-C3N4/rGO supercapacitor can be stable up to 5000 cycles. • Cyclic data of the composite were modeled and predicted using a time series analysis method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
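
The time-series step above applies Holt-Winters exponential smoothing to cyclic stability data. A minimal statsmodels sketch on a synthetic capacitance-retention series (a trend-only Holt model; the electrochemical data are not the paper's):

```python
# Hedged sketch of the time-series step only: model capacitance retention over
# charge/discharge cycles with Holt-Winters (Holt's linear) exponential smoothing
# and extrapolate a few thousand cycles ahead.
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(10)
cycles = np.arange(1, 5001)
retention = 100 - 0.0035 * cycles + rng.normal(0, 0.3, cycles.size)   # % of initial capacitance

fit = ExponentialSmoothing(retention, trend="add", seasonal=None).fit()
forecast = fit.forecast(2000)                     # predicted retention for cycles 5001-7000
```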
29. Are we crossing a minimum of the Gleissberg centennial cycle? Multivariate machine learning-based prediction of the sunspot number using different proxies of solar activity and spectral analysis.
- Author
- Rodríguez, José-Víctor, Sánchez Carrasco, Víctor Manuel, Rodríguez-Rodríguez, Ignacio, Pérez Aparicio, Alejandro Jesús, and Vaquero, José Manuel
- Subjects
- SOLAR cycle; SOLAR activity; STANDARD deviations; TIME series analysis; SUNSPOTS; FAST Fourier transforms
- Abstract
We propose a new method for predicting the solar cycle in terms of the sunspot number (SN) based on multivariate machine learning algorithms, various proxies of solar activity, and the spectral analysis of all considered time series via the fast Fourier transform (through the latter we identify periodicities with which to lag these series and thus generate new attributes, i.e. predictors, for incorporation in the prediction model). This combination of three different techniques in a single method is expected to enhance the accuracy and reliability of the solar activity prediction models developed to date. Thus, predictive results for SN are presented for Solar Cycles 25 (the current one) and 26 (using the 13-month smoothed SN, version 2) up until January 2038, yielding maximum values of 134.2 (in June 2024) and 115.4 (in May 2034), respectively, with a root mean squared error (RMSE) of 9.8. These results imply, on the one hand, a maximum of Cycle 25 below the average and, on the other hand, a lower peak than the preceding ones for Cycle 26, suggesting that Solar Cycles 24, 25, and 26 are part of a minimum of the centennial Gleissberg cycle, as occurred with Cycles 12, 13, and 14 in the final years of the 19th century and the early 20th century. [ABSTRACT FROM AUTHOR]
- Published
- 2024
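
The recipe above combines FFT-based spectral analysis (to pick lag periods) with multivariate machine learning. A minimal sketch on a synthetic smoothed sunspot-like series (a random forest stands in for the authors' algorithms; the lag set is an illustrative assumption):

```python
# Hedged sketch of the general recipe: find dominant periodicities with the FFT,
# lag the series at those periods to build extra predictors, and fit a regressor.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(11)
n = 1200                                           # months of a synthetic smoothed sunspot number
t = np.arange(n)
sn = 80 + 60 * np.sin(2 * np.pi * t / 132) + rng.normal(0, 5, n)   # ~11-year cycle

spec = np.abs(np.fft.rfft(sn - sn.mean()))
freqs = np.fft.rfftfreq(n, d=1.0)                  # cycles per month
dominant_period = int(round(1.0 / freqs[1:][np.argmax(spec[1:])]))  # skip the zero frequency

lags = [1, 12, dominant_period]                    # new attributes from the spectral analysis
max_lag = max(lags)
X = np.column_stack([sn[max_lag - lag: n - lag] for lag in lags])
y = sn[max_lag:]
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)
```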
30. Wavelet-based correlations of the global magnetic field in connection to strongest earthquakes.
- Author
- Lyubushin, Alexey and Rodionov, Eugeny
- Subjects
- GEOMAGNETISM; MAGNETIC flux density; EARTHQUAKE magnitude; TIME series analysis; VECTOR fields
- Abstract
We consider 3-component records of the magnetic field strength with a time step of 1 min at 153 stations of the INTERMAGNET network for 31 years, 1991–2021. Data analysis is based on the calculation of pairwise correlation coefficients between wavelet coefficients in successive time windows 1 day long (1440 min counts). To describe the state of the magnetic field, the maxima of the average values of all pairwise correlation coefficients between stations were chosen, calculated over all detail levels of the wavelet decomposition and over all components of the magnetic field strength vector. The daily time series of such maxima is called wavelet correlation. The division of the network stations into 7 clusters is considered, and a time series of wavelet correlations is calculated for each cluster. In a sliding time window with a length of 365 days, correlation measures of synchronization of wavelet correlations from different clusters are calculated, which are compared with the strongest earthquakes with a magnitude of at least 8.5. For the global time series of wavelet correlations, the method of influence matrices is used to study the relationship between the maximum correlation responses to a change in the length of the day and a sequence of earthquakes with a magnitude of at least 7. As a result of the analysis, precursor effects are identified, and the important role of the Maule earthquake in Chile on February 27, 2010 in the behavior of the response of magnetic field for the preparation of strong seismic events is shown. [ABSTRACT FROM AUTHOR]
- Published
- 2024
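
The daily "wavelet correlation" above is built from pairwise correlations of wavelet detail coefficients between stations. A minimal PyWavelets sketch for two synthetic one-minute records over one day (the wavelet choice, the number of levels, and the max-over-levels aggregation are illustrative assumptions):

```python
# Hedged sketch of a wavelet-correlation measure between two magnetometer records:
# decompose each daily window with a discrete wavelet transform and take the
# largest absolute correlation across detail levels.
import numpy as np
import pywt

def wavelet_correlation(a, b, wavelet="db4", level=5):
    ca = pywt.wavedec(a, wavelet, level=level)
    cb = pywt.wavedec(b, wavelet, level=level)
    corrs = [np.corrcoef(x, y)[0, 1] for x, y in zip(ca[1:], cb[1:])]   # detail levels only
    return max(np.abs(corrs))

rng = np.random.default_rng(12)
common = rng.normal(size=1440)                      # shared signal, 1-min samples over one day
station_a = common + 0.5 * rng.normal(size=1440)
station_b = common + 0.5 * rng.normal(size=1440)
print(f"daily wavelet correlation: {wavelet_correlation(station_a, station_b):.2f}")
```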
31. Regression-based Model for Predicting Simulated vs Actual Building Performance Discrepancies.
- Author
- Izonin, Ivan, Tkachenko, Roman, Caro, Rosana, LaTorre de la Fuente, Antonio, Yemets, Kyrylo, and Mitoulis, Stergios Aristoteles
- Subjects
- RANDOM forest algorithms; DIGITAL twins; BUILDING performance; TIME series analysis; REGRESSION analysis
- Abstract
Accurately predicting discrepancies between simulated and actual building performance is becoming increasingly crucial in building management and in the optimization of, e.g., Digital Twins and energy efficiency assessments. This challenge is amplified by the growing dependence on simulation models that predict building behavior, energy consumption, and operational efficiency. Despite advancements in simulation technology, aligning these models with real-world data remains a persistent challenge. This study addresses this challenge by developing a regression-based model designed to predict discrepancies between simulated and actual operational characteristics of buildings. The model identifies differences between synchronized time series to generate a new series that highlights these discrepancies. By employing a sliding window technique, the model processes actual operational data to predict discrepancies. The approach was implemented using AdaBoost and Random Forest, and its performance was evaluated across eight datasets of indoor air temperature from two cities. The results demonstrate that the Random Forest algorithm significantly outperforms AdaBoost, with improvements in R² scores of up to 23 percent and reductions in RMSE by up to 4.5 times. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Energy Consumption Prediction System based on clustering techniques.
- Author
-
Estrada, Rebeca, Farinango, Pedro, Santana, Kevin, and Asanza, Victor
- Subjects
ENERGY consumption of buildings ,STANDARD deviations ,TIME series analysis ,ENERGY consumption ,RANDOM forest algorithms - Abstract
The ability to predict energy consumption in buildings and homes is becoming an increasingly crucial aspect of energy management. This ability could help the environment and the economy by making buildings more energy efficient and reducing operational costs. This paper proposes a system for predicting energy consumption for non-residential buildings. It compares traditional machine learning (ML) models and clustering techniques such as K-means, DBSCAN, and hierarchical clustering. The goal is to improve energy consumption predictions by analyzing the consumption data and deriving features that are used to group similar buildings together. We evaluate two representative time series for each cluster: an average and the time series of the building closest to the centroid. ML models such as Linear Regression (LR), Step-wise LR, Tree, SVM, Efficient LR, Ensemble and Random Forest were evaluated for the two representative time series to identify the best one. Numerical results demonstrate that the Random Forest model delivered the lowest root mean square error (RMSE) of the predicted variable for each cluster, with the exception of cluster 3, where the ensemble ML model exhibited the lowest error. Finally, when evaluating the prediction based on the selected models for the rest of the time series belonging to a cluster, the average time series has the lowest RMSE compared to the time series of the building closest to the centroid of the cluster. [ABSTRACT FROM AUTHOR]
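The two cluster representatives mentioned above (the cluster average versus the building closest to the centroid) can be sketched as follows, assuming scikit-learn and, for simplicity, clustering directly on the raw consumption series rather than on derived features.

```python
import numpy as np
from sklearn.cluster import KMeans

def representative_series(consumption, n_clusters=4, seed=0):
    """consumption: array (n_buildings, n_timesteps). Clusters the buildings and
    returns, per cluster, the average series and the series of the building
    closest to the centroid -- the two candidate representatives to be fed to
    the downstream regression models."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(consumption)
    reps = {}
    for c in range(n_clusters):
        members = consumption[km.labels_ == c]
        dists = np.linalg.norm(members - km.cluster_centers_[c], axis=1)
        reps[c] = {"average": members.mean(axis=0),
                   "closest_to_centroid": members[np.argmin(dists)]}
    return km.labels_, reps
```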
- Published
- 2024
- Full Text
- View/download PDF
33. Dynamic obstacle avoidance model of autonomous driving with attention mechanism and temporal residual block.
- Author
-
Chi, Xinrui, Guo, Zhanbin, and Cheng, Fu
- Subjects
DEEP learning ,FEATURE extraction ,TIME series analysis ,PYRAMIDS ,OBJECT tracking (Computer vision) ,ALGORITHMS - Abstract
Dynamic obstacle avoidance is crucial in autonomous driving, ensuring vehicle safety by preventing collisions and enhancing driving efficiency. Dynamic obstacle avoidance algorithms have made significant progress due to deep learning. However, video-based target detection methods can suffer from missed or false detections when processing consecutive frames, especially for high-speed moving targets or in complex dynamic scenes. Multi-target tracking methods require intricate algorithm designs for target initialization and occluded object recovery, which can be compromised by tracker performance, leading to unstable tracking or target loss. To address the issue of target loss in multi-target tracking, we designed the novel YTCN model, which infuses time series information through temporal convolution and enhances the sensitivity of the receptive field with Spatial Pyramid Pooling, Feature Concatenate and Spatial Convolution (SPPFCSPC), enhancing the model's feature extraction capability. Simultaneously, we developed the novel Global Attention Mechanism (GAM) and Double Attention (DA) mechanisms that merge channel and spatial features to enhance feature representation. Finally, we design the Temporal Residual Block (TRB) to model the temporal behavior of obstacles. Experimental results show that our method achieves 82.4 % mAP@0.5 on the BDD100K dataset, 1.3 % higher than previous methods. [ABSTRACT FROM AUTHOR]
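As an illustration only, a temporal residual block of the kind named in the abstract could look like the following PyTorch sketch; the channel width, kernel length, and layer order are assumptions, not the YTCN architecture.

```python
import torch
import torch.nn as nn

class TemporalResidualBlock(nn.Module):
    """Hypothetical temporal residual block: 1-D convolutions over the time axis
    of per-frame feature vectors with a residual connection, one way to inject
    time series information into a frame-based detector."""
    def __init__(self, channels=256, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        self.conv = nn.Sequential(
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.BatchNorm1d(channels),
            nn.ReLU(inplace=True),
            nn.Conv1d(channels, channels, kernel_size, padding=pad),
            nn.BatchNorm1d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):                      # x: (batch, channels, frames)
        return self.act(x + self.conv(x))

features = torch.randn(2, 256, 8)              # 8 consecutive frames of pooled features
out = TemporalResidualBlock()(features)
```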
- Published
- 2024
- Full Text
- View/download PDF
34. A novel HMM distance measure with state alignment.
- Author
-
Yang, Nan, Leung, Cheuk Hang, and Yan, Xing
- Subjects
- *
HIDDEN Markov models , *TIME series analysis , *DENSITY of states - Abstract
In this paper, we introduce a novel distance measure that conforms to the definition of a semi-distance, for quantifying the similarity between Hidden Markov Models (HMMs). This distance measure is not only easier to implement, but also accounts for state alignment before distance calculation, ensuring correctness and accuracy. Our proposed distance measure presents a significant advancement in HMM comparison, offering a more practical and accurate solution compared to existing measures. Numerical examples that demonstrate the utility of the proposed distance measure are given for HMMs with continuous state probability densities. In real-world data experiments, we employ HMM to represent the evolution of financial time series or music. Subsequently, leveraging the proposed distance measure, we conduct HMM-based unsupervised clustering, demonstrating promising results. Our approach proves effective in capturing the inherent difference in dynamics of financial time series, showcasing the practicality and success of the proposed distance measure. • We introduce a novel semi-distance measure for comparing two HMMs. • Our measures are easy to compute, with a pivotal state alignment step. • They lead to meaningful HMM comparison and subsequent practical clustering. • We capture intricate dynamics of financial time series through clustering. [ABSTRACT FROM AUTHOR]
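To make the idea of state alignment concrete, here is a minimal Python sketch for Gaussian-emission HMMs: states are matched with the Hungarian algorithm on a symmetric KL divergence between emission densities before the transition matrices are compared. This illustrates the alignment step only and is not the paper's exact semi-distance.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def aligned_hmm_distance(means1, stds1, A1, means2, stds2, A2):
    """Toy state-aligned distance between two Gaussian-emission HMMs with the
    same number of states: align states by emission similarity, then compare
    emissions and (permuted) transition matrices under that alignment."""
    def sym_kl(m1, s1, m2, s2):
        kl = lambda ma, sa, mb, sb: np.log(sb / sa) + (sa**2 + (ma - mb)**2) / (2 * sb**2) - 0.5
        return kl(m1, s1, m2, s2) + kl(m2, s2, m1, s1)

    n = len(means1)
    cost = np.array([[sym_kl(means1[i], stds1[i], means2[j], stds2[j])
                      for j in range(n)] for i in range(n)])
    rows, cols = linear_sum_assignment(cost)        # optimal state alignment
    emis_term = cost[rows, cols].mean()
    trans_term = np.abs(A1 - A2[np.ix_(cols, cols)]).mean()
    return emis_term + trans_term
```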
- Published
- 2024
- Full Text
- View/download PDF
35. Expanding access to high-quality early care and education for families with low income in Maryland through child care subsidy policies.
- Author
-
Halle, Tamara, Tang, Jing, Maxfield, Emily Theresa, Gerson, Cassandra Simons, Verhoye, Alexandra, Madill, Rebecca, Piña, Gabriel, Gottesman, Patti Banghart, Solomon, Bonnie, Caballero-Acosta, Sage, Lin, Ying-Chun, Fuller, James, and Kelley, Sarah
- Subjects
- *
POOR families , *TIME series analysis , *SPECIFIC gravity , *INCOME , *CHILD care - Abstract
• Maryland enacted several policies to increase access to quality ECE from January 2015–March 2020. • Provider subsidy participation grew following elevated income limits and reimbursement rates. • Neighborhoods with lower-poverty density had greater increases in provider subsidy participation. • The percent of children using subsidies to attend higher-rated programs increased between 2018 and 2019. • Increased use of higher-rated ECE over time was not due to families changing providers. Documenting how federal and state child care policies increase equitable access to high-quality early care and education (ECE) for families with low- and moderate-incomes remains a challenge in part due to overlaps in policy enactment. This study used an interrupted time series analysis (ITSA) to describe changes to providers' participation in Maryland's child care subsidy program following implementation of a constellation of child care policies enacted between January 5, 2015, and March 2, 2020 (i.e., prior to the COVID-19 pandemic). Findings indicate a marked increase in the percentage of licensed family child care (FCC) and center-based providers serving children with a subsidy following increases in household income eligibility levels and provider reimbursement rates in 2018. Provider participation rates varied by neighborhood income level, with participation expanding more in neighborhoods with lower poverty density relative to their starting level in 2015. Changes in child participation rates by income eligibility mirrored changes in state subsidy policy: children residing in income-eligible households above 200 % federal poverty level represented 4.4 % of the child sample in 2018, 13 % in 2019, and 18 % in 2020. The proportion of children with a subsidy who used higher-rated ECE increased significantly between January 2018 and January 2020 for all racial/ethnic groups, income eligibility levels, and urbanicity categories. The majority (62 %) of children who stayed in the subsidy program between 2018 and 2019 stayed with their same provider, many of which obtained their first rating or increased their quality rating during this time frame in accordance with a new requirement for providers to participate in the state's quality rating system to receive a subsidy reimbursement. Implications for future research, policy, and practice are discussed. [ABSTRACT FROM AUTHOR]
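An interrupted time series analysis of the kind used in this study can be sketched with a segmented regression in statsmodels; the monthly outcome, intervention month, and coefficients below are entirely illustrative, not Maryland subsidy data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative ITSA: a monthly outcome (e.g. % of providers serving a child with
# a subsidy) with a hypothetical policy change at month 36.
rng = np.random.default_rng(1)
months = np.arange(60)
policy = (months >= 36).astype(int)                      # post-intervention indicator
y = 20 + 0.05 * months + 6 * policy + 0.3 * policy * (months - 36) + rng.normal(0, 1, 60)
df = pd.DataFrame({"y": y, "t": months, "policy": policy,
                   "t_after": policy * (months - 36)})

# 'policy' captures the level change, 't_after' the slope change at the interruption
fit = smf.ols("y ~ t + policy + t_after", data=df).fit()
print(fit.params)
```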
- Published
- 2024
- Full Text
- View/download PDF
36. Positive changes in breastfeeding and complementary feeding indicators in Brazil are associated with favorable nurturing care environments.
- Author
-
Salviano, A.F., Guedes, B.M., Carioca, A.A.F., Venancio, S.I., Buccini, G., and Lourenço, B.H.
- Subjects
- *
BREASTFEEDING , *INFANTS , *FOOD consumption , *PRIMARY health care , *NUTRITIONAL requirements , *TIME series analysis , *DESCRIPTIVE statistics , *CHILD development , *NURTURING behavior , *CHILDREN - Abstract
To analyze trends in breastfeeding and complementary feeding indicators for infants and young children receiving primary health care (PHC) services in Brazil, considering the contextual aspects of local nurturing care (NC) environments. Ecological time-series study. Ten feeding indicators were extracted from 1,055,907 food intake records of children aged <2 years reported by PHC facilities from 2015 to 2019. Local NC environments were assessed with the Brazilian Early Childhood Friendly Municipal Index, calculating overall and stratified scores for the NC domains of adequate nutrition, good health, opportunities for early learning, and security and safety. Prais–Winsten regression was used to calculate annual percent changes (APC) by sex and the contrast in APC between the lower and upper quintiles of NC scores. Positive or negative APC with P -values <0.05 represented increasing or decreasing trends. No significant trends of exclusive and continued breastfeeding, food introduction, or minimum dietary diversity were observed, with 2019 prevalences of 54.5%, 45.2%, 92.5%, and 78.2%, respectively. Increasing trends were observed for mixed milk feeding (2019: 19.2%; APC, +2.42%) and minimum meal frequency (2019: 61.1%; APC, +2.56%), while decreasing trends were observed for sweet beverage consumption (2019: 31.9%; APC, −5.92%) and unhealthy foods (2019: 16.1%; APC, −4.69%). Indicator improvements were significantly stronger in environments more favorable for NC. Although the indicators did not meet global targets for infant feeding practices, the results suggest that the local NC environment encompasses facilitators that may be strategic in the design of early childhood programs and policies to improve nutrition. [ABSTRACT FROM AUTHOR]
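The annual percent change (APC) statistic reported above can be approximated from a log-linear trend with an AR(1) error correction; the sketch below uses statsmodels' GLSAR with iterative fitting as a stand-in for Prais–Winsten regression, and the prevalence values are illustrative, not study data.

```python
import numpy as np
import statsmodels.api as sm

def annual_percent_change(years, prevalence):
    """APC from the slope of a log-linear trend fitted with an AR(1)-corrected
    regression (GLSAR used here as an approximation to Prais-Winsten)."""
    X = sm.add_constant(np.asarray(years, dtype=float))
    fit = sm.GLSAR(np.log(prevalence), X, rho=1).iterative_fit(maxiter=10)
    slope = fit.params[1]
    return (np.exp(slope) - 1.0) * 100.0

years = np.array([2015, 2016, 2017, 2018, 2019])
prev = np.array([25.0, 24.1, 22.6, 20.9, 19.2])          # illustrative values only
print(f"APC: {annual_percent_change(years, prev):+.2f}% per year")
```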
- Published
- 2024
- Full Text
- View/download PDF
37. Negative binomial community network vector autoregression for multivariate integer-valued time series.
- Author
-
Guo, Xiangyu and Zhu, Fukang
- Subjects
- *
NEGATIVE binomial distribution , *MARGINAL distributions , *TIME series analysis , *HETEROGENEITY - Abstract
Modeling multivariate integer-valued time series with appropriate methods is currently a popular research topic. In this paper, we propose a multivariate integer-valued autoregressive time series model based on a fixed network community structure. We use the negative binomial distribution as the conditional marginal distribution and a copula to construct the conditional joint distribution. The newly proposed model incorporates node heterogeneity. Stability conditions are provided for both fixed and increasing dimensions. We estimate the parameters of the proposed model by maximizing the quasi-likelihood function with known and unknown community membership matrices, respectively. Corresponding asymptotic properties of the parameter estimates are also provided. A simulation study is conducted to demonstrate the asymptotic behavior of the proposed model, and two real datasets are employed to compare the proposed model with other competitive models. • Modeling multivariate integer-valued time series with suitable methods is an active research topic. • A model with network community structure is proposed, which has node heterogeneity. • Stability conditions for both fixed and increasing dimensions are provided. • Parameters are estimated with known and unknown community membership matrices. • Two real datasets are used to illustrate the model's flexibility. [ABSTRACT FROM AUTHOR]
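To illustrate the kind of data-generating process this class of models targets, the following sketch simulates a toy negative-binomial network autoregression in which each node's conditional mean depends on its own lag and the lagged average of its network neighbours; the parameterization is an assumption and omits the copula and community structure of the proposed model.

```python
import numpy as np

def simulate_nb_network_ar(W, beta0, beta_net, beta_self, kappa, T, seed=0):
    """Simulate counts y[t, i] with conditional mean
    mu = beta0 + beta_net * (W @ y[t-1]) + beta_self * y[t-1]
    and negative-binomial dispersion kappa (W: row-normalized adjacency)."""
    rng = np.random.default_rng(seed)
    y = np.zeros((T, W.shape[0]))
    for t in range(1, T):
        mu = beta0 + beta_net * (W @ y[t - 1]) + beta_self * y[t - 1]
        p = kappa / (kappa + mu)                 # NB parameterization with mean mu
        y[t] = rng.negative_binomial(kappa, p)
    return y

# 3-node fully connected network, row-normalized
W = np.array([[0, .5, .5], [.5, 0, .5], [.5, .5, 0]])
counts = simulate_nb_network_ar(W, beta0=1.0, beta_net=0.3, beta_self=0.4, kappa=5, T=200)
print(counts[:5])
```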
- Published
- 2024
- Full Text
- View/download PDF
38. Depressive symptoms and suicidal ideation among university students before and after the COVID-19 pandemic.
- Author
-
Macalli, Mélissa, Castel, Laura, Jacqmin-Gadda, Hélène, Galesne, Charline, Tournier, Marie, Galéra, Cédric, Pereira, Edwige, and Tzourio, Christophe
- Subjects
- *
MENTAL health of students , *SUICIDAL ideation , *TIME series analysis , *ATTEMPTED suicide , *MENTAL depression - Abstract
The COVID-19 pandemic and lockdown have had negative effects on students' mental health. However, little information is available regarding the frequencies of depressive symptoms and suicidal ideation during the post-pandemic period. We aimed to determine the effect of the COVID-19 pandemic on depressive symptoms and suicidal ideation among French university students. In this comparative study, 4463 students were recruited during the pre-COVID-19 pandemic period (2013-2020) and 1768 students during the post-COVID-19 pandemic period (2022-2023). Standardized frequencies of depressive symptoms and suicidal ideation were compared between the two time periods. Changes in the level of depressive symptoms and suicidal ideation between the pre- and post-pandemic periods were then analyzed using interrupted time series analysis. Compared to participants from the pre-pandemic sample, participants from the post-pandemic sample had higher standardized rates of depressive symptoms (40.6 % vs 25.6 %) and suicidal ideation (29.3 % vs 21.1 %). Segmented logistic regression showed an approximately 50 % increased risk of depressive symptoms (aOR, 1.47; 95 % CI, 1.01–2.13) and a 100 % increased risk of suicidal ideation (aOR, 2.00; 95 % CI, 1.33–3.00) in the post-pandemic period. Before the pandemic, there was no significant time trend for depressive symptoms (aOR, 1.002; 95 % CI, 0.999–1.006) or suicidal thoughts (aOR, 0.999; 95 % CI, 0.995–1.002). Potential biases include self-selection of participants into the study and information bias. History of depression and suicide attempt were self-reported. These findings reveal an alarming deterioration of students' mental health in the post-pandemic period compared to the pre-pandemic era. • Students in the post-pandemic era had higher frequencies of depression and suicidal ideation • Interrupted time series analyses revealed a change in the level of depressive symptoms and suicidal ideation in the post-pandemic period • These findings suggest an alarming deterioration of students' mental health [ABSTRACT FROM AUTHOR]
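A segmented logistic regression of the type reported above can be sketched as follows with statsmodels; the simulated outcome, time scale, and post-period indicator are illustrative stand-ins for the study data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative segmented logistic regression for a binary outcome (e.g. depressive
# symptoms) observed before and after an interruption.
rng = np.random.default_rng(2)
n = 4000
t = rng.uniform(0, 10, n)                       # years since study start
post = (t > 7).astype(int)                      # hypothetical post-interruption indicator
logit_p = -1.1 + 0.002 * t + 0.65 * post
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
df = pd.DataFrame({"y": y, "t": t, "post": post})

fit = smf.logit("y ~ t + post", data=df).fit(disp=0)
print(np.exp(fit.params))                       # odds ratios: time trend and level shift
```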
- Published
- 2025
- Full Text
- View/download PDF
39. A Cloud-native Approach for Processing of Crowdsourced GNSS Observations and Machine Learning at Scale: A Case Study from the CAMALIOT Project.
- Author
-
Kłopotek, Grzegorz, Pan, Yuanxin, Sturn, Tobias, Weinacker, Rudi, See, Linda, Crocetti, Laura, Awadaljeed, Mudathir, Rothacher, Markus, McCallum, Ian, Fritz, Steffen, Navarro, Vicente, and Soja, Benedikt
- Subjects
- *
GLOBAL Positioning System , *KALMAN filtering , *MULTISENSOR data fusion , *TIME series analysis , *ELECTRONIC data processing , *MACHINE learning , *SMARTPHONES - Abstract
The era of modern smartphones, running on Android version 7.0 and higher, now facilitates the acquisition of raw dual-frequency, multi-constellation GNSS observations. This paves the way for GNSS community data to be potentially exploited for precise positioning, GNSS reflectometry or geoscience applications at large. The continuously expanding global GNSS infrastructure along with the enormous volume of prospective GNSS community data bring, however, major challenges related to data acquisition, its storage, and subsequent processing for deriving various parameters of interest. In addition, such large datasets can no longer be managed manually, leading to the need for fully automated and sophisticated data processing pipelines. Application of Machine Learning Technology for GNSS IoT data fusion (CAMALIOT) was an ESA NAVISP Element 1 project (NAVISP-EL1-038.2) with activities aiming to address the aforementioned points related to GNSS community data and their exploitation for scientific applications with the use of Machine Learning (ML). This contribution provides an overview of the CAMALIOT project with information on the designed and implemented cloud-native software for GNSS processing and ML at scale, the developed Android application for retrieving GNSS observations from the modern generation of smartphones through dedicated crowdsourcing campaigns, the related data ingestion and processing, and GNSS analysis concerning both conventional and smartphone GNSS observations. With the use of the developed GNSS engine employing an Extended Kalman Filter, example processing results related to the Zenith Total Delay (ZTD) and Slant Total Electron Content (STEC) are provided based on the analysis of observations collected with geodetic-grade GNSS receivers and from local measurement sessions involving a Xiaomi Mi 8 that collected GNSS observations using the developed Android application. For smartphone observations, ZTD is derived in a differential manner based on a single-frequency double-difference approach employing GPS and Galileo observations, whereas satellite-specific STEC time series are obtained through carrier-to-code leveling based on the geometry-free linear combination of observations from both GPS and Galileo constellations. Although the ZTD and STEC time series from smartphones were derived on a demonstration basis, a rather good level of consistency of such estimates with respect to the reference time series was found. For the considered periods, the RMS of the differences between the derived smartphone-based time series of differential zenith wet delay and the reference values was below 3.1 mm. For the satellite-specific STEC time series expressed with respect to the reference STEC series, the RMS of the offset-reduced differences was below 1.2 TECU. Smartphone-based observations require special attention, including additional processing steps and a dedicated parameterization, in order to obtain reliable atmospheric estimates. Although of lower measurement quality than traditional sources of GNSS data, augmenting ground-based networks of fixed high-end GNSS receivers with GNSS-capable smartphones would form an interesting source of complementary information for various studies relying on GNSS observations. [ABSTRACT FROM AUTHOR]
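As a toy counterpart to the EKF-based estimation of ZTD mentioned above, the following minimal one-state Kalman filter treats ZTD as a random walk; the noise variances, sampling interval, and initial state are illustrative assumptions, not the CAMALIOT engine's settings.

```python
import numpy as np

def random_walk_kalman(obs, obs_var, process_var, x0=2.3, p0=1.0):
    """Minimal 1-state Kalman filter: a ZTD-like quantity (metres) modeled as a
    random walk, updated with noisy scalar observations."""
    x, p, out = x0, p0, []
    for z in obs:
        p = p + process_var                     # time update (random walk)
        k = p / (p + obs_var)                   # Kalman gain
        x = x + k * (z - x)                     # measurement update
        p = (1 - k) * p
        out.append(x)
    return np.asarray(out)

rng = np.random.default_rng(3)
truth = 2.3 + np.cumsum(rng.normal(0, 1e-4, 2880))        # 30 s steps over one day
ztd_filtered = random_walk_kalman(truth + rng.normal(0, 5e-3, truth.size),
                                  obs_var=25e-6, process_var=1e-8)
```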
- Published
- 2024
- Full Text
- View/download PDF
40. A satellite orbit maneuver detection and robust multipath mitigation method for GPS coordinate time series.
- Author
-
Zhou, Houxiang, Wang, Xiaoya, and Zhong, Shengjian
- Subjects
- *
TIME series analysis , *GLOBAL Positioning System , *ORBITS (Astronomy) , *ORBITS of artificial satellites - Abstract
The multipath effect is the main factor limiting high-precision Global Navigation Satellite System (GNSS) positioning because it cannot be eliminated by double-difference observations or existing empirical models. Although receiver antenna technology can reduce the multipath effect, it adds cost and cannot effectively address the short-delay multipath error of carrier phase observations. Sidereal filtering (SF) is a common method used to mitigate the multipath effect in static positioning mode. However, the coordinate time series may be contaminated by outliers, and satellite orbits may be affected by maneuvers. These issues lead to inaccurate estimation of the Multipath Repetition Period (MRP) and can even degrade the multipath mitigation effect of SF. To solve these problems, a Sidereal Filtering method with Satellite Maneuver Detection and Robustness (SFSMDR) is proposed. In this method, based on the Orbital Repeat Time Method (ORTM), the Multipath Repetition Period with Satellite Maneuvers Detection (MRPSMD) is determined. Furthermore, considering the robustness of median filtering to outliers, the coordinate time series of the first day are preprocessed with median filtering. Experimental results show that the proposed SFSMDR method can effectively detect satellite maneuvers. In the absence of satellite maneuvers, there is no need to re-estimate the MRP, which can save computational cost to a certain extent. When the coordinate time series on the first day is contaminated by outliers, the positioning accuracy of the MRPSMD for the east and north directions is decreased by 2.26% and 95.73%, respectively. In contrast, the positioning accuracy of the SFSMDR is increased by 46.21% and 52.71%, respectively. [ABSTRACT FROM AUTHOR]
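The basic sidereal-filtering step, shifting the previous day's (median-filtered) coordinate residuals by the multipath repetition period and subtracting them, can be sketched as follows; the MRP shift, sampling rate, and synthetic multipath signal are illustrative, and the maneuver-detection logic of the proposed SFSMDR is not reproduced.

```python
import numpy as np
from scipy.signal import medfilt

def sidereal_filter(day1, day2, mrp_samples, kernel=11):
    """Shift the median-filtered day-1 residuals by the multipath repetition
    period (in samples) and subtract them from day 2 to remove the repeating
    multipath signature. Median filtering gives robustness to outliers."""
    template = medfilt(day1, kernel_size=kernel)
    return day2 - np.roll(template, mrp_samples)

rng = np.random.default_rng(4)
t = np.arange(2880)                                   # 30 s epochs over one day
multipath = 0.004 * np.sin(2 * np.pi * t / 240)       # repeating error pattern (toy)
day1 = multipath + rng.normal(0, 0.002, t.size)
day2 = np.roll(multipath, -9) + rng.normal(0, 0.002, t.size)   # pattern arrives earlier
cleaned = sidereal_filter(day1, day2, mrp_samples=-9)
```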
- Published
- 2024
- Full Text
- View/download PDF
41. Assessment of GNSS stations using atmospheric horizontal gradients and microwave radiometry.
- Author
-
Elgered, Gunnar, Ning, Tong, Diamantidis, Periklis-Konstantinos, and Nilsson, Tobias
- Subjects
- *
MICROWAVE radiometry , *GLOBAL Positioning System , *MICROWAVE radiometers , *MICROWAVE antennas , *MICROWAVE materials , *TIME series analysis - Abstract
We have assessed the quality of four co-located GNSS stations by studying time series of estimated linear horizontal gradients in the signal delay. The stations have different electromagnetic environments. We also examine the consistency of the results by using two different GNSS software packages, GipsyX and c5++, and applying three different elevation cutoff angles: 5°, 10°, and 20°. The estimated gradients are compared with the corresponding ones estimated from microwave radiometer observations acquired during six months (April–September 2021). For all four stations, and using both software packages, we find that it is possible to track gradient variations on time scales of less than one hour using GPS observations only. We have indications that it is an advantage to equip the area below the GNSS antenna with microwave absorbing material. However, the differences are small: the rms differences in the gradients, compared to those from the microwave radiometer, are reduced by less than 2 %. More studies are needed to decide if such an investment is reasonable in terms of cost and maintenance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
42. Indoor hydrogen dispersion with stratified filling: Can non-dimensional parameters relate dispersion characteristics across diverse applications?
- Author
-
Vanlaere, Joren, Hendrick, Patrick, and Blondeau, Julien
- Subjects
- *
TIME series analysis , *ATMOSPHERIC density , *MOLE fraction , *WEATHER , *FOSSIL fuels - Abstract
Hydrogen, with its unique flammability characteristics, demands additional consideration due to its broader flammable range compared to fossil fuels. Its low density at atmospheric conditions results in significant buoyancy, mitigating risks in outdoor applications. In confined spaces, hydrogen releases can lead to flammable cloud formation. To study this problem, independent of its dimensions, a dimensional analysis is introduced based on Buckingham's Π-theorem. This work focuses on thirteen functional parameters, including mole fraction, time, release velocity, orifice diameter, reduced gravity, molecular mass diffusion, viscosity and geometric dimensions. Four dimensional scenarios are simulated and compared in this study, to assess the adequacy of the proposed non-dimensional approach. The setup involves a parallelepiped enclosure with a single release point. RANS simulations are conducted. The proposed non-dimensional formulation proves valuable for interpreting data and discussing practical applications. The study contributes to a deeper understanding of hydrogen filling regimes in confined spaces, offering insights for safety assessments and risk mitigation strategies. • Functional parameters for hydrogen distribution are proposed. • Application of Buckingham's Π-theorem yields essential non-dimensional ratios. • RANS CFD-model used to simulate and discuss four similar scenarios. • Time series analysis of non-dimensional flammable volume reveals similarity. • Ratios and non-dimensional flammable volume enable dimension-independent safety analysis. [ABSTRACT FROM AUTHOR]
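The dimensional-analysis step can be illustrated generically: build a dimension matrix for a subset of the functional parameters and take its null space, whose basis vectors are exponent sets of dimensionless Π groups. The five-parameter subset below is an assumption for illustration, not the paper's full set of thirteen parameters.

```python
import numpy as np
from scipy.linalg import null_space

# Illustrative subset of functional parameters: release velocity U, orifice
# diameter D, reduced gravity g', kinematic viscosity nu, and time t.
# Rows are the base dimensions L and T; entries are each parameter's exponents
# in those dimensions (e.g. U = L T^-1).
params = ["U", "D", "g_prime", "nu", "t"]
dim_matrix = np.array([
    # U    D   g'  nu   t
    [ 1,   1,  1,  2,   0],   # L
    [-1,   0, -2, -1,   1],   # T
], dtype=float)

# Buckingham's Pi-theorem: each null-space vector of the dimension matrix gives
# a set of exponents combining the parameters into a dimensionless Pi group.
for k, vec in enumerate(null_space(dim_matrix).T, start=1):
    terms = " ".join(f"{p}^{e:+.2f}" for p, e in zip(params, vec) if abs(e) > 1e-9)
    print(f"Pi_{k}: {terms}")
```

The resulting basis vectors can be rescaled and recombined into more familiar ratios, for example a Froude-type group U/√(g′D) and a Reynolds-type group UD/ν.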
- Published
- 2024
- Full Text
- View/download PDF
43. Geographical proximity and technological similarity.
- Author
-
Haddad, Eduardo A., Araújo, Inácio F., and Perobelli, Fernando S.
- Subjects
- *
FINANCIAL crises , *TIME series analysis , *INPUT-output analysis , *ECONOMIC structure , *LOGICAL prediction - Abstract
• Compare a time series of input-output coefficients for 66 different countries. • Assess the effects of geographical proximity on technological convergence over time. • Closer economies tend to be more similar due to geographical tech spillovers. • Institutional proximity also matters for technological convergence. • Over time, closer economies are becoming structurally more similar. From a time-space perspective, we assess the effects of geographical proximity on technological convergence over time, identifying proximity dimensions associated with countries' technological similarities. We compare a time series of input-output coefficients for 66 different countries extracted from the 2021 edition of the OECD Inter-Country Input-Output tables to verify whether nearby countries are more likely to share similar technologies. Our results reveal that geographical technological spillovers are important, since closer economies tend to be more similar than distant ones. This is particularly evident for the European economies in the sample, suggesting that institutional proximity also matters for technological convergence. Over time, closer economies are becoming structurally more similar; however, this trend seems to have slowed down after the 2008–9 financial crisis. The way informational gaps are filled during the consolidation of the databases – compiled in an environment of limited information, following known practices of using regional and global average structures – may add a layer of uncertainty to our results. [ABSTRACT FROM AUTHOR]
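One simple way to operationalize "technological similarity" between two countries' input-output structures is a cosine similarity of their coefficient matrices, which can then be related to geographical distance; the sketch below uses toy data and is not the paper's exact metric.

```python
import numpy as np

def technological_similarity(io_a, io_b):
    """Cosine similarity between two flattened input-output coefficient matrices,
    a simple proxy for structural (technological) similarity."""
    a, b = np.asarray(io_a).ravel(), np.asarray(io_b).ravel()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# toy check: similarity vs. distance for three hypothetical countries
rng = np.random.default_rng(5)
io = {c: rng.random((10, 10)) * 0.3 for c in ["A", "B", "C"]}
dist_km = {("A", "B"): 500, ("A", "C"): 8000, ("B", "C"): 8200}
for (c1, c2), d in dist_km.items():
    print(c1, c2, d, round(technological_similarity(io[c1], io[c2]), 3))
```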
- Published
- 2024
- Full Text
- View/download PDF
44. Coherence bias mitigation through regularized tapered coherence matrix for phase linking in decorrelated environments.
- Author
-
Liang, Hongyu, Zhang, Lei, Li, Xin, and Wu, Jicang
- Subjects
- *
SYNTHETIC aperture radar , *SOURCE code , *TIME series analysis , *SUPPLY chain management , *SAMPLE size (Statistics) - Abstract
The phase linking technique has shown the ability to mitigate the decorrelation effect in time series interferometric synthetic aperture radar (InSAR) data. By imposing the temporal phase-closure constraint, this technique reconstructs a consistent phase series from the complex sample coherence matrix (SCM). However, the bias of coherence estimates degrades the performance of phase linking, especially in near-zero coherence environments with limited spatial sample support. In this study, we present a methodology to enhance phase linking, with an emphasis on SCM refinement. The idea is to shrink the tapered SCM towards a scaled identity matrix by exploiting the inner correlation and the coherence loss trend in the SCM. This allows the SCM magnitude to be debiased even with small sample sizes. We demonstrate the performance of this method by simulations and real case studies using Sentinel-1 data over Hawaii island. Results from comprehensive comparisons validate the effectiveness of the coherence matrix estimation and the enhancement to phase linking in different coherence scenarios. The source code and sample dataset are available at https://www.mathworks.com/matlabcentral/fileexchange/169553-insar-phase-linking-enhancement-by-scm-refinement. [ABSTRACT FROM AUTHOR]
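The shrinkage idea can be sketched as follows: the tapered sample coherence matrix is blended with a scaled identity before an EMI-style eigendecomposition recovers a consistent phase series. This Python illustration is not the published MATLAB code; the shrinkage weight and the solver choice are assumptions.

```python
import numpy as np

def regularized_phase_linking(scm, taper, alpha=0.3):
    """scm: complex Hermitian sample coherence matrix (n x n); taper: real
    symmetric tapering matrix. The tapered SCM is shrunk towards a scaled
    identity, then the phase series is taken from the eigenvector of the
    smallest eigenvalue of an EMI-style weighted matrix."""
    n = scm.shape[0]
    tapered = scm * taper
    reg = (1 - alpha) * tapered + alpha * (np.trace(tapered).real / n) * np.eye(n)
    coh = np.abs(reg)
    w = np.linalg.inv(coh) * reg              # Hadamard product (EMI-style weighting)
    _, vecs = np.linalg.eigh(w)               # Hermitian eigendecomposition
    phase = np.angle(vecs[:, 0])              # eigenvector of the smallest eigenvalue
    return phase - phase[0]                   # referenced to the first acquisition
```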
- Published
- 2024
- Full Text
- View/download PDF
45. A semi-supervised multi-temporal landslide and flash flood event detection methodology for unexplored regions using massive satellite image time series.
- Author
-
Deijns, Axel A.J., Michéa, David, Déprez, Aline, Malet, Jean-Philippe, Kervyn, François, Thiery, Wim, and Dewitte, Olivier
- Subjects
- *
TIME series analysis , *REMOTE-sensing images , *CLOUDINESS , *RANDOM forest algorithms , *SPATIAL behavior , *LANDSLIDES - Abstract
Landslides and flash floods are geomorphic hazards (GH) that often co-occur and interact and frequently lead to societal and environmental impacts. The compilation of detailed multi-temporal inventories of GH events over a variety of contrasting natural as well as human-influenced landscapes is essential to understanding their behavior in both space and time and allows the human drivers to be disentangled from the natural baselines. Yet, creating multi-temporal inventories of these GH events remains difficult and costly in terms of human labor, especially when relatively large regions are investigated. Methods to derive GH location from satellite optical imagery have been continuously developed and have shown a clear shift in recent years from conventional methodologies like thresholding and regression to machine learning (ML) methodologies given their improved predictive performance. However, these current generation ML methodologies generally rely on accurate information on either the GH location (training samples) or the GH timing (pre- and post-event imagery), making them unsuitable for unexplored regions without a priori information on GH occurrences. Currently, a detection methodology to create multi-temporal GH event inventories applicable in relatively large unexplored areas containing a variety of landscapes does not yet exist. We present a new semi-supervised methodology that allows for the detection of both the location and timing of GH event occurrence with optical time series, while minimizing manual user interventions. We use the peak of the cumulative difference to the mean for a multitude of spectral indices derived from open-access, high spatial resolution (10–20 m) Copernicus Sentinel-2 time series and generate a map per Sentinel-2 tile that identifies impacted pixels and their related timing. These maps are used to identify GH event impacted zones. We then use the generated maps, the identified GH event impacted zones, and the automatically derived timing as training samples in a Random Forest classifier to improve the spatial detection accuracy within the impacted zones. We showcase the methodology on six Sentinel-2 tiles in the tropical East African Rift, where we detect 29 GH events between 2016 and 2021. We use 12 of these GH events (totaling ∼3900 GH features) with varying times of occurrence, contrasting landscape conditions and different landslide to flash flood ratios to validate the detection methodology. The average identified timing of the GH events lies within two to four weeks of their actual occurrence. The sensitivity of the methodology is mainly influenced by the differences in landscapes, the amount of cloud cover and the size of the GH events. Our methodology is applicable in various landscapes, can be run in a systematic mode, and is dependent only on a few parameters. The methodology is adapted for massive computation. [ABSTRACT FROM AUTHOR]
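The per-pixel detection statistic, the peak of the cumulative difference to the mean of a spectral-index time series, can be sketched as follows for a cloud-masked, gap-filled series; this is a simplified illustration of the timing step only, not the full semi-supervised pipeline.

```python
import numpy as np

def impact_timing(index_series, dates):
    """index_series: cloud-masked, gap-filled spectral-index values for one pixel;
    dates: matching acquisition dates. A persistent post-event drop makes the
    cumulative deviation from the mean peak near the event, so the argmax of the
    cumulative difference gives the estimated impact timing."""
    x = np.asarray(index_series, dtype=float)
    cum_dev = np.cumsum(x - x.mean())
    peak = int(np.argmax(cum_dev))
    return dates[peak], cum_dev[peak]
```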
- Published
- 2024
- Full Text
- View/download PDF
46. Incremental multi temporal InSAR analysis via recursive sequential estimator for long-term landslide deformation monitoring.
- Author
-
Ao, Meng, Wei, Lianhuan, Liao, Mingsheng, Zhang, Lu, Dong, Jie, and Liu, Shanjun
- Subjects
- *
GLOBAL Positioning System , *TIME series analysis , *DEFORMATION of surfaces , *BATCH processing , *DATA libraries , *LANDSLIDES - Abstract
Distributed Scatterers Interferometry (DS-InSAR) has been widely applied to increase the number of measurement points (MP) in complex mountainous areas with dense vegetation and complicated topography. However, the DS-InSAR method adopts a batch processing mode: when new observation data are acquired, the entire data archive is reprocessed, completely ignoring the existing results, which is not suitable for high-performance processing of operational observation data. Current research focuses on the automation of SAR data acquisition and processing optimization, but the core time series analysis method remains unchanged. In this paper, building on the traditional Sequential Estimator proposed by Ansari in 2017, a Recursive Sequential Estimator with Flexible Batches (RSEFB) is developed to divide a large dataset flexibly, without requirements on the number of images in each subset. This method updates and processes newly acquired SAR data in near real time and obtains long time-series results without reprocessing the entire data archive, which is helpful for future landslide early warning. 132 Sentinel-1 SAR images and 44 TerraSAR-X SAR images were utilized to invert the line-of-sight (LOS) surface deformation of the Xishancun and Huangnibazi landslides in Li County, Sichuan Province, China. The RSEFB method is applied to retrieve time-series displacements from the Sentinel-1 and TerraSAR-X datasets, respectively. The comparison with the traditional Sequential Estimator and validation through Global Positioning System (GPS) monitoring data proved the effectiveness and reliability of the RSEFB method. The research shows that the Xishancun landslide is in a state of slow and uneven deformation and that the non-sliding part of the Huangnibazi landslide shows a clear deformation signal, so continuous monitoring is needed to prevent and mitigate possible catastrophic slope failure events. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
47. Explaining the decisions and the functioning of a convolutional spatiotemporal land cover classifier with channel attention and redescription mining.
- Author
-
Pelous, Enzo, Méger, Nicolas, Benoit, Alexandre, Atto, Abdourrahmane, Ienco, Dino, Courteille, Hermann, and Lin-Kwong-Chon, Christophe
- Subjects
- *
CONVOLUTIONAL neural networks , *LAND cover , *REMOTE-sensing images , *TIME series analysis , *ARTIFICIAL intelligence - Abstract
Convolutional neural networks trained with satellite image time series have demonstrated their potential in land cover classification in recent years. Nevertheless, the rationale leading to their decisions remains obscure by nature. Methods for providing relevant and simplified explanations of their decisions as well as methods for understanding their inner functioning have thus emerged. However, both kinds of methods generally work separately and no explicit connection between their findings is made available. This paper presents an innovative method for refining the explanations provided by channel-based attention mechanisms. It consists in identifying correspondence rules between neuronal activation levels and the presence of spatiotemporal patterns in the input data for each channel and target class. These rules provide both class-level and instance-level explanations, as well as an explicit understanding of the network operations. They are extracted using a state-of-the-art redescription mining algorithm. Experiments on the Reunion Island Sentinel-2 dataset show that both correct and incorrect decisions can be explained using convenient spatiotemporal visualizations. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
48. Complex system anomaly detection via learnable temporal-spatial graph with degradation tendency segmentation.
- Author
-
Han, Qinfeng, Chen, Jinglong, Wang, Jun, and Feng, Yong
- Subjects
ANOMALY detection (Computer security) ,RELIABILITY in engineering ,ROCKET engines ,TIME series analysis ,LEARNING strategies - Abstract
To guarantee the safe and reliable operation of equipment such as liquid rocket engines (LRE), carrying out system-level anomaly detection (AD) is crucial. However, current methods ignore prior knowledge of the mechanical system itself and seldom tightly unite the observations with the inherent relations in the data. Meanwhile, they neglect the weakness and non-independence of system-level anomalies, which differ from component faults. To overcome the above limitations, we propose a separate reconstruction framework using degradation tendency for system-level AD. To prevent anomalous features from being attenuated, we first propose dividing a single sample into two equal-length parts along the temporal dimension. We then maximize the maximum mean discrepancy (MMD) between feature segments to force the encoders to learn normal features with different distributions. Then, to fully explore the multivariate time series, we model temporal-spatial dependence by temporal convolution and graph attention. In addition, a joint graph learning strategy is proposed to handle prior knowledge and data characteristics simultaneously. Finally, the proposed method is evaluated on two real multi-sensor datasets from an LRE, and the results demonstrate the effectiveness and potential of the proposed method for system-level AD. • A novel neural network based on segmenting and reconstructing temporal-spatial features for system anomaly detection. • The segmenting operation, which is simple and universal for time series data, is designed to overcome the weakness of anomalies. • A joint graph learning strategy and a novel temporal-spatial feature extraction module are proposed for multi-source data. • Experiments on two different real-world datasets demonstrate the superiority of the proposed method. [ABSTRACT FROM AUTHOR]
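The segment-level discrepancy term maximized between the two halves of a sample is the maximum mean discrepancy; a standard Gaussian-kernel estimate of the squared MMD is sketched below (a generic formulation, with the bandwidth as an assumption, not the paper's training code).

```python
import numpy as np

def gaussian_mmd2(x, y, sigma=1.0):
    """Biased estimate of the squared maximum mean discrepancy between feature
    sets x (n, d) and y (m, d) with a Gaussian kernel of bandwidth sigma."""
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

# toy check: two feature segments drawn from slightly different distributions
rng = np.random.default_rng(0)
seg_a = rng.normal(0.0, 1.0, (128, 16))
seg_b = rng.normal(0.5, 1.0, (128, 16))
print(gaussian_mmd2(seg_a, seg_b))
```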
- Published
- 2024
- Full Text
- View/download PDF
49. Identification of climatic tipping points and transitions in Chinese loess grain-size records utilizing nonlinear time series analysis.
- Author
-
Xue, Haozhong, Song, Song, Qiu, Mengfan, Huang, Xiaofang, Yang, Shiling, and Tang, Zihua
- Subjects
- *
ATLANTIC meridional overturning circulation , *GLOBAL environmental change , *CLIMATE change , *TIME series analysis , *LOESS - Abstract
As one of the most important terrestrial sediments, Chinese loess provides valuable information on regional and global climatic and environmental changes and holds great potential for studying nonlinear behaviors of the East Asian monsoon system. Utilizing objective and quantitative methods to identify tipping points and climate transitions in paleoclimatic records can help us understand climate change in the Chinese Loess Plateau (CLP). This study explores critical tipping points and nonlinear climate transitions within the CLP using the Chiloparts record, a comprehensive 2600-ka paleoclimate dataset. We pinpointed potential tipping points using recurrence quantification analysis and the augmented Kolmogorov-Smirnov test, ultimately identifying 15 critical tipping points. We argue that these 15 tipping points represent some of the most significant climatic changes recorded in the Chinese loess paleoclimate record. Employing recurrence quantification analysis, recurrence networks, and visibility graphs, we also identified several climate transitions, including the Mid-Pleistocene Transition (MPT) and the Mid-Brunhes Transition (MBT), and characterized their nonlinear signatures. We particularly highlight a significant climatic regime transition around 500 ka that may reflect a nonlinear response to variations in the Atlantic Meridional Overturning Circulation (AMOC). Our research also contributes to the understanding of the complex interplay between loess deposition, environmental change, and tectonic activity, emphasizing the need for further investigations to elucidate the mechanisms driving these transitions. [ABSTRACT FROM AUTHOR]
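Recurrence quantification analysis starts from a recurrence matrix built on a time-delay embedding of the record; a minimal sketch is given below, with the embedding dimension, delay, and threshold as illustrative choices rather than the study's settings.

```python
import numpy as np

def recurrence_matrix(series, dim=3, delay=1, eps=None):
    """Recurrence matrix from a scalar record via time-delay embedding. eps
    defaults to 10% of the embedded data's standard deviation. Returns the
    binary matrix and the recurrence rate, the simplest RQA measure."""
    x = np.asarray(series, dtype=float)
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = 0.1 * emb.std()
    R = (dist <= eps).astype(int)
    return R, R.mean()
```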
- Published
- 2024
- Full Text
- View/download PDF
50. Colored noise in GRACE total water storage time series: Its impact on trend significance in the Türkiye region and major world river basins.
- Author
-
Gunes, Ozge and Aydin, Cuneyt
- Subjects
- *
TIME series analysis , *RANDOM effects model , *STATISTICAL hypothesis testing , *PINK noise , *WATER storage , *NOISE , *WATERSHEDS - Abstract
• Temporal correlations exist in the GRACE TWS time series, resulting in the presence of colored noise. • To show the impact of these correlations, we apply an autoregressive noise model and variance component estimation. • Neglecting these correlations leads to underestimation of trend uncertainty and misinterpretation of its significance. Temporal correlations are prevalent in most geophysical time series data, leading to what is known as colored noise, which becomes more apparent in the frequency domain than in the time domain. Neglecting this type of noise in modeling can result in underestimated standard errors for the parameters of the model used to analyze the time series. As a result, statistical tests misinterpret the significance of these parameters, such as the overall trend rate, because the signal-to-noise ratio becomes larger than it should be. In this study, we investigate the temporal correlations in the time series data from the Gravity Recovery and Climate Experiment (GRACE) mission. GRACE and its successor mission GRACE Follow-On (GRACE-FO) have been successfully employed to monitor global total water storage anomalies over two decades since the initial launch in 2002. We primarily utilize monthly GRACE mascon (mass concentration) solutions provided by NASA Goddard Space Flight Center (GSFC), examining both regional and global scales. The power spectral density of the residuals, calculated after applying standard harmonic regression to each mascon time series on Earth's land, reveals an average spectral index of κ = −1. This indicates the predominance of flicker noise, one of the most common forms of colored noise in geodetic time series. Notably, the noise becomes more Brownian (κ < −1) in regions with intense hydrological events, while it tends towards white noise (κ → 0) in areas with less water circulation than other land regions. This observation suggests that unmodelled hydrological events, such as droughts, floods, and ice-melt-related events, contribute colored noise to the residuals of the time series. To account for this noise in our time series modeling, we primarily consider autoregressive (AR) moving average (MA) processes, commonly known as ARMA, in addition to the standard variance component estimation for noise amplitudes. We focus on individual mascon blocks in the Türkiye Region and mascon solutions for 24 of the world's major river basins. Our findings demonstrate that an ARMA(1,1) model is sufficient for describing the noise in these regions. The results underscore the importance of considering colored noise, as it leads to approximately three times larger standard errors for the overall trend rate. This substantially impacts the significance of the trend rates. [ABSTRACT FROM AUTHOR]
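The spectral index κ used above to characterize the residual noise can be estimated from the slope of the log-log power spectrum of the residuals; a minimal sketch for a monthly series is given below, with the sampling rate and estimator details as illustrative assumptions rather than the study's processing chain.

```python
import numpy as np
from scipy.signal import periodogram

def spectral_index(residuals, fs=12.0):
    """Slope of the log-log power spectrum of a residual series (fs = 12 samples
    per year for monthly data): kappa ~ 0 indicates white noise, kappa ~ -1
    flicker noise, and kappa ~ -2 random-walk (Brownian) noise."""
    f, p = periodogram(residuals, fs=fs, detrend="linear")
    mask = f > 0
    slope, _ = np.polyfit(np.log10(f[mask]), np.log10(p[mask]), 1)
    return slope

# toy check: white noise should give kappa near zero
print(spectral_index(np.random.default_rng(0).normal(size=240)))
```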
- Published
- 2024
- Full Text
- View/download PDF