41,379 results for "Liu, Fang"
Search Results
2. Phylogenetic relationships in Vicia subgenus Vicilla (Fabaceae) based on combined evidence from DNA sequences
- Author
Wu, Fei-Fei, Sun, Weihong, Liu, Fang, Gao, Qiu, Jin, Meiyan, Liu, Bowen, and Wang, Xian-Guo
- Published
- 2021
3. Analysis Methodology for Age of Information under Sequence Based Scheduling
- Author
Liu, Fang, Wong, Wing Shing, Lo, Yuan-Hsun, Zhang, Yijin, and Chen, Chung Shue
- Subjects
Computer Science - Information Theory
- Abstract
We focus on the Age of Information (AoI) performance in a system where each user generates packets periodically to send to a common access point (AP) for status updating. To avoid heavy overhead, we assume that channel sensing, feedback information from the AP, and time synchronization are not available in the system. We adopt a multi-access scheme called the sequence scheme, where each user is assigned a periodic binary sequence to schedule their transmissions. In our previous work [18], we have thoroughly studied the AoI performance under the sequence scheme when the period of schedule sequences, $L$, is equal to the status generating period, $T$. The results can be extended to the case where $T>L$. However, the case of $T<L$ ...
- Published
- 2024
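The sequence scheme above is straightforward to prototype: each user transmits exactly in the slots where its periodic binary sequence has a 1, and the AP's age for a user resets whenever a collision-free transmission delivers that user's newest packet. Below is a minimal simulation sketch of this setup for the $T=L$ case the paper analyzes; it assumes an ideal collision channel, and all names (`average_aoi`, `schedules`) are illustrative rather than taken from the paper.

```python
import numpy as np

def average_aoi(schedules, T, num_slots=10_000):
    """Average Age of Information per user under the sequence scheme.
    schedules: (users, L) 0/1 array; user u transmits in slot t iff
    schedules[u, t % L] == 1. Every user generates a fresh status packet
    each T slots and always sends its newest one; with no channel sensing
    or feedback, a slot succeeds only when exactly one user transmits."""
    num_users, L = schedules.shape
    last_gen = np.zeros(num_users)      # newest packet's timestamp
    last_recv = np.zeros(num_users)     # timestamp of last delivered packet
                                        # (assume an initial delivery at t=0)
    aoi_sum = np.zeros(num_users)
    for t in range(num_slots):
        if t % T == 0:
            last_gen[:] = t             # periodic status generation
        tx = schedules[:, t % L].astype(bool)
        if tx.sum() == 1:               # collision-free slot succeeds
            u = np.flatnonzero(tx)[0]
            last_recv[u] = last_gen[u]
        aoi_sum += (t + 1) - last_recv  # age measured at the end of the slot
    return aoi_sum / num_slots

# Two users with staggered period-4 sequences, T = L = 4.
print(average_aoi(np.array([[1, 0, 0, 0], [0, 0, 1, 0]]), T=4))
```

With the two staggered sequences above, the per-user averages settle near 2.5 and 4.5 slots, reflecting the gap between packet generation and each user's next reserved slot.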
4. SAFES: Sequential Privacy and Fairness Enhancing Data Synthesis for Responsible AI
- Author
Giddens, Spencer and Liu, Fang
- Subjects
Computer Science - Machine Learning, Computer Science - Cryptography and Security
- Abstract
As data-driven and AI-based decision making gains widespread adoption in most disciplines, it is crucial that both data privacy and decision fairness are appropriately addressed. While differential privacy (DP) provides a robust framework for guaranteeing privacy and several widely accepted methods have been proposed for improving fairness, the vast majority of existing literature treats the two concerns independently. For methods that do consider privacy and fairness simultaneously, they often only apply to a specific machine learning task, limiting their generalizability. In response, we introduce SAFES, a Sequential PrivAcy and Fairness Enhancing data Synthesis procedure that sequentially combines DP data synthesis with a fairness-aware data transformation. SAFES allows full control over the privacy-fairness-utility trade-off via tunable privacy and fairness parameters. We illustrate SAFES by combining AIM, a graphical model-based DP data synthesizer, with a popular fairness-aware data pre-processing transformation. Empirical evaluations on the Adult and COMPAS datasets demonstrate that for reasonable privacy loss, SAFES-generated synthetic data achieve significantly improved fairness metrics with relatively low utility loss.
- Published
- 2024
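Since SAFES is described as a sequential composition, DP synthesis first and fairness-aware pre-processing second, its control flow can be sketched compactly. The snippet below is a schematic with toy stand-ins: `dp_synthesize` is a naive noisy-marginal sampler rather than AIM, `fair_transform` is a simple label-massaging step rather than the transform used in the paper, and a binary numeric target column is assumed.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def dp_synthesize(df, epsilon):
    """Toy stand-in for a DP synthesizer such as AIM: sample each column
    independently from its Laplace-noised marginal (count sensitivity 1).
    Real synthesizers model correlations; this only shows where the
    privacy budget epsilon is spent."""
    eps_col = epsilon / df.shape[1]          # naive budget split per column
    out = {}
    for col in df.columns:
        counts = df[col].value_counts()
        noisy = np.clip(counts.to_numpy(float)
                        + rng.laplace(0, 1 / eps_col, len(counts)), 1e-12, None)
        out[col] = rng.choice(counts.index, size=len(df), p=noisy / noisy.sum())
    return pd.DataFrame(out)

def fair_transform(df, protected, target, repair_level):
    """Toy fairness pre-processing ("massaging"): nudge each group's
    positive rate toward the overall rate by flipping labels, scaled by
    repair_level in [0, 1]. SAFES uses a published transform instead."""
    df = df.copy()
    overall = df[target].mean()
    for _, grp in df.groupby(protected):
        gap = overall - grp[target].mean()   # >0: group needs more positives
        want = 1 if gap > 0 else 0
        idx = grp.index[grp[target] != want]
        n = min(int(abs(gap) * len(grp) * repair_level), len(idx))
        if n > 0:
            df.loc[rng.choice(idx, size=n, replace=False), target] = want
    return df

def safes(df, protected, target, epsilon, repair_level):
    # The DP guarantee is fixed at synthesis; the fairness repair is
    # post-processing of synthetic data, so it cannot weaken that guarantee.
    synthetic = dp_synthesize(df, epsilon)
    return fair_transform(synthetic, protected, target, repair_level)

df = pd.DataFrame({"sex": rng.integers(0, 2, 500),
                   "edu": rng.integers(0, 4, 500),
                   "income": rng.integers(0, 2, 500)})
print(safes(df, "sex", "income", epsilon=1.0, repair_level=0.8)["income"].mean())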
5. Towards Improved Preference Optimization Pipeline: from Data Generation to Budget-Controlled Regularization
- Author
Chen, Zhuotong, Liu, Fang, Zhu, Jennifer, Du, Wanyu, and Qi, Yanjun
- Subjects
Computer Science - Machine Learning, Computer Science - Artificial Intelligence, Computer Science - Computation and Language
- Abstract
Direct Preference Optimization (DPO) and its variants have become the de facto standards for aligning large language models (LLMs) with human preferences or specific goals. However, DPO requires high-quality preference data and suffers from unstable preference optimization. In this work, we aim to improve the preference optimization pipeline by taking a closer look at preference data generation and training regularization techniques. For preference data generation, we demonstrate that existing scoring-based reward models produce unsatisfactory preference data and perform poorly on out-of-distribution tasks. This significantly impacts the LLM alignment performance when using these data for preference tuning. To ensure high-quality preference data generation, we propose an iterative pairwise ranking mechanism that derives preference ranking of completions using pairwise comparison signals. For training regularization, we observe that preference optimization tends to achieve better convergence when the LLM predicted likelihood of preferred samples gets slightly reduced. However, the widely used supervised next-word prediction regularization strictly prevents any likelihood reduction of preferred samples. This observation motivates our design of a budget-controlled regularization formulation. Empirically we show that combining the two designs leads to aligned models that surpass existing SOTA across two popular benchmarks.
- Comment: 15 pages
- Published
- 2024
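The budget-controlled idea in this abstract, tolerating a small drop in the preferred completion's likelihood instead of forbidding any drop, can be written as a hinge on the likelihood decrease. The following PyTorch sketch shows one plausible formulation; the paper's exact loss may differ, and `beta`, `budget`, and `reg_weight` are illustrative hyperparameters.

```python
import torch
import torch.nn.functional as F

def dpo_budget_loss(logp_w, logp_l, ref_logp_w, ref_logp_l,
                    beta=0.1, budget=0.05, reg_weight=1.0):
    """DPO loss plus a budget-controlled regularizer (illustrative form).
    logp_*: summed log-likelihoods of the preferred (w) / rejected (l)
    completions under the policy; ref_logp_*: same under the frozen
    reference model. All tensors have shape (batch,)."""
    # Standard DPO objective on the implicit reward margin.
    margin = (logp_w - ref_logp_w) - (logp_l - ref_logp_l)
    dpo = -F.logsigmoid(beta * margin)
    # Budget-controlled regularization: penalize the preferred sample's
    # likelihood only once it falls more than `budget` nats below the
    # reference, instead of forbidding any decrease (as plain supervised
    # next-word regularization effectively does).
    drop = ref_logp_w - logp_w          # > 0 means the likelihood decreased
    reg = F.relu(drop - budget)
    return (dpo + reg_weight * reg).mean()

# Toy usage with random numbers standing in for per-sequence log-probs.
b = 4
loss = dpo_budget_loss(torch.randn(b), torch.randn(b),
                       torch.randn(b), torch.randn(b))
print(float(loss))
```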
6. SUANPAN: Scalable Photonic Linear Vector Machine
- Author
Yang, Ziyue, Li, Chen, Ran, Yuqia, Li, Yongzhuo, Feng, Xue, Cui, Kaiyu, Liu, Fang, Sun, Hao, Zhang, Wei, Ye, Yu, Qiao, Fei, Ning, Cun-Zheng, Wang, Jiaxing, Chang-Hasnain, Connie J., and Huang, Yidong
- Subjects
Physics - Optics
- Abstract
Photonic linear operation is a promising approach to handle the extensive vector multiplications in artificial intelligence techniques due to the natural bosonic parallelism and high-speed information transmission of photonics. Although it is believed that maximizing the interaction of the light beams is necessary to fully utilize the parallelism, and tremendous efforts have been made in past decades, the achieved dimensionality of vector-matrix multiplication is very limited due to the difficulty of scaling up a tightly interconnected or highly coupled optical system. Additionally, there is still a lack of a universal photonic computing architecture that can be readily merged with existing computing systems to meet the computing power demand of AI techniques. Here, we propose a programmable and reconfigurable photonic linear vector machine that performs only the inner product of two vectors and is formed by a series of independent basic computing units, where each unit is just one pair of a light emitter and a photodetector. Since there is no interaction among the light beams inside, extreme scalability can be achieved by simply duplicating the independent basic computing unit, with no requirement for large-scale analog-to-digital and digital-to-analog converter arrays. Our architecture is inspired by the traditional Chinese Suanpan, or abacus, and is thus denoted photonic SUANPAN. As a proof of principle, the SUANPAN architecture is implemented with an 8×8 vertical cavity surface emission laser array and an 8×8 MoTe2 two-dimensional material photodetector array. We believe that our proposed photonic SUANPAN is capable of serving as a fundamental linear vector machine that can be readily merged with existing electronic digital computing systems and has the potential to enhance the computing power for various future AI applications.
- Published
- 2024
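Stripped of the photonics, the computational primitive is an inner product assembled from fully independent multiply units: each emitter-detector pair contributes one product and the partial results are summed electrically. The toy sketch below mirrors that decomposition numerically; the quantization and noise terms are simplifying assumptions, not the paper's device model, and inputs are taken to be non-negative values in [0, 1] as intensity encoding suggests.

```python
import numpy as np

rng = np.random.default_rng(0)

def suanpan_inner_product(a, b, bits=8, noise=0.01):
    """Inner product via independent emitter-detector units (schematic).
    Each unit encodes a_i on its emitter drive and b_i on its detector
    gain, so its output is a_i * b_i; there is no optical coupling
    between units, which is why the array scales by simple duplication."""
    a_q = np.round((2**bits - 1) * a) / (2**bits - 1)   # quantized drive levels
    b_q = np.round((2**bits - 1) * b) / (2**bits - 1)
    unit_outputs = a_q * b_q                            # one product per unit
    unit_outputs += rng.normal(0, noise, size=a.shape)  # detector noise
    return unit_outputs.sum()                           # summed photocurrent

a, b = rng.random(64), rng.random(64)                   # an 8x8 unit array
print(suanpan_inner_product(a, b), a @ b)               # noisy vs. exact
```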
7. Ant Detective: An Automated Approach for Counting Ants in Densely Populated Images and Gaining Insight into Ant Foraging Behavior
- Author
Das, Mautushi, Liu, Fang-Ling Chloe, Hartle, Charly, Yang, Chin-Cheng Scotty, and Chen, C. P. James
- Subjects
Computer Science - Computer Vision and Pattern Recognition
- Abstract
Ant foraging behavior is essential to understanding ecological dynamics and developing effective pest management strategies, but quantifying this behavior is challenging due to the labor-intensive nature of manual counting, especially in densely populated images. This study presents an automated approach using computer vision to count ants and analyze their foraging behavior. Leveraging the YOLOv8 model, the system was calibrated and evaluated on datasets encompassing various imaging scenarios and densities. The study results demonstrate that the system achieves average precision and recall of up to 87.96% and 87.78%, respectively, with only 64 calibration images provided when both the calibration and evaluation images share similar imaging backgrounds. When the background is more complex than the calibration images, the system requires a larger calibration set to generalize effectively, with 1,024 images yielding precision and recall of up to 83.60% and 78.88%, respectively. In more challenging scenarios where more than one thousand ants are present in a single image, the system significantly improves detection accuracy by slicing images into smaller patches, reaching a precision and recall of 77.97% and 71.36%, respectively. The system can also generate heatmaps that visualize the spatial distribution of ant activity over time, providing valuable insights into foraging patterns. This spatial-temporal analysis enables a more comprehensive understanding of ant behavior, which is crucial for ecological studies and improving pest control methods. By automating the counting process and offering detailed behavioral analysis, this study provides an efficient tool for researchers and pest control professionals to develop more effective strategies.
- Published
- 2024
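The slicing step credited with the accuracy gain on images containing over a thousand ants is a generic detect-on-tiles pattern: run the detector on overlapping patches, shift boxes back to global coordinates, and de-duplicate across tile borders. A minimal sketch follows, assuming a `detect(patch)` callable that returns rows of `[x1, y1, x2, y2, score]` (for YOLOv8 this would wrap the Ultralytics predict call); tile size, overlap, and the IoU threshold are illustrative.

```python
import numpy as np

def count_with_slicing(image, detect, tile=640, overlap=128, iou_thr=0.5):
    """Detect on overlapping tiles, map boxes back to image coordinates,
    then de-duplicate across tile borders with a greedy NMS. The final
    ant count is simply the number of surviving boxes."""
    H, W = image.shape[:2]
    step = tile - overlap
    boxes = []
    for y in range(0, max(H - overlap, 1), step):
        for x in range(0, max(W - overlap, 1), step):
            for x1, y1, x2, y2, s in detect(image[y:y + tile, x:x + tile]):
                boxes.append([x1 + x, y1 + y, x2 + x, y2 + y, s])
    boxes = np.array(boxes)
    if len(boxes) == 0:
        return 0
    area = lambda bb: (bb[..., 2] - bb[..., 0]) * (bb[..., 3] - bb[..., 1])
    order = boxes[:, 4].argsort()[::-1]   # highest-score boxes first
    keep = []
    while len(order):
        i, order = order[0], order[1:]
        keep.append(i)
        if not len(order):
            break
        b, rest = boxes[i], boxes[order]
        ix1 = np.maximum(b[0], rest[:, 0]); iy1 = np.maximum(b[1], rest[:, 1])
        ix2 = np.minimum(b[2], rest[:, 2]); iy2 = np.minimum(b[3], rest[:, 3])
        inter = np.clip(ix2 - ix1, 0, None) * np.clip(iy2 - iy1, 0, None)
        iou = inter / (area(b) + area(rest) - inter)
        order = order[iou <= iou_thr]     # drop duplicates of the kept box
    return len(keep)
```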
8. Application of an ANN and LSTM-based Ensemble Model for Stock Market Prediction
- Author
Liu, Fang, Guo, Shaobo, Xing, Qianwen, Sha, Xinye, Chen, Ying, Jin, Yuhui, Zheng, Qi, and Yu, Chang
- Subjects
Computer Science - Computational Engineering, Finance, and Science
- Abstract
Stock trading has always been a key economic indicator in modern society and a primary source of profit for financial giants such as investment banks, quantitative trading firms, and hedge funds. Discovering the underlying patterns within the seemingly volatile yet intrinsically structured economic activities has become a central focus of research for many companies. Our study leverages widely-used modern financial forecasting algorithms, including LSTM, ANN, CNN, and BiLSTM. We begin by comparing the predictive performance of these well-known algorithms on our stock market data, utilizing metrics such as R2, MAE, MSE, and RMSE for detailed evaluation. Based on the performance of these models, we then aim to combine their strengths while mitigating their weaknesses, striving to construct a powerful hybrid model that overcomes the performance limitations of individual models. Through rigorous experimentation and exploration, we ultimately developed an LSTM+ANN model that breaks through prior performance bottlenecks, achieving promising and exciting results.
- Comment: This paper is accepted by ICISCAE 2024
- Published
- 2024
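For reference, the four metrics used for the model comparison above have compact definitions; this small helper is illustrative, not the paper's code.

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """R2, MAE, MSE, RMSE as commonly used to compare forecasting models."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_true - y_pred
    mse = np.mean(err**2)
    return {
        "R2": 1 - mse / np.var(y_true),   # fraction of variance explained
        "MAE": np.mean(np.abs(err)),
        "MSE": mse,
        "RMSE": np.sqrt(mse),
    }

print(regression_metrics([1.0, 2.0, 3.0], [1.1, 1.9, 3.2]))
```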
9. FastFixer: An Efficient and Effective Approach for Repairing Programming Assignments
- Author
Liu, Fang, Liu, Zhenwei, Zhao, Qianhui, Jiang, Jing, Zhang, Li, Li, Ge, Sun, Zian, Li, Zhongqi, and Ma, Yuchi
- Subjects
Computer Science - Computers and Society, Computer Science - Software Engineering
- Abstract
Providing personalized and timely feedback for students' programming assignments is useful for programming education. Automated program repair (APR) techniques have been used to fix the bugs in programming assignments, where Large Language Model (LLM) based approaches have shown promising results. Given the growing complexity of identifying and fixing bugs in advanced programming assignments, current fine-tuning strategies for APR are inadequate in guiding the LLM to identify bugs and make accurate edits during the generative repair process. Furthermore, the autoregressive decoding approach employed by the LLM could potentially impede the efficiency of the repair, thereby hindering the ability to provide timely feedback. To tackle these challenges, we propose FastFixer, an efficient and effective approach for programming assignment repair. To assist the LLM in accurately identifying and repairing bugs, we first propose a novel repair-oriented fine-tuning strategy, aiming to enhance the LLM's attention towards learning how to generate the necessary patch and its associated context. Furthermore, to speed up the patch generation, we propose an inference acceleration approach that is specifically tailored for the program repair task. The evaluation results demonstrate that FastFixer obtains an overall improvement of 20.46% in assignment fixing when compared to the state-of-the-art baseline. Considering the repair efficiency, FastFixer achieves a remarkable inference speedup of 16.67 times compared to the autoregressive decoding algorithm.
- Comment: Accepted by the 39th IEEE/ACM International Conference on Automated Software Engineering (ASE 2024)
- Published
- 2024
10. Observation of an axial-vector state in the study of $\psi(3686) \to \phi \eta \eta'$ decay
- Author
BESIII Collaboration, Ablikim, M., Achasov, M. N., Adlarson, P., Afedulidis, O., Ai, X. C., Aliberti, R., Amoroso, A., An, Q., Bai, Y., Bakina, O., Balossino, I., Ban, Y., Bao, H. -R., Batozskaya, V., Begzsuren, K., Berger, N., Berlowski, M., Bertani, M., Bettoni, D., Bianchi, F., Bianco, E., Bortone, A., Boyko, I., Briere, R. A., Brueggemann, A., Cai, H., Cai, X., Calcaterra, A., Cao, G. F., Cao, N., Cetin, S. A., Chang, J. F., Chang, W. L., Che, G. R., Chelkov, G., Chen, C., Chen, C. H., Chen, Chao, Chen, G., Chen, H. S., Chen, M. L., Chen, S. J., Chen, S. L., Chen, S. M., Chen, T., Chen, X. R., Chen, X. T., Chen, Y. B., Chen, Y. Q., Chen, Z. J., Chen, Z. Y., Choi, S. K., Chu, X., Cibinetto, G., Cossio, F., Cui, J. J., Dai, H. L., Dai, J. P., Dbeyssi, A., de Boer, R. E., Dedovich, D., Deng, C. Q., Deng, Z. Y., Denig, A., Denysenko, I., Destefanis, M., De Mori, F., Ding, B., Ding, X. X., Ding, Y., Dong, J., Dong, L. Y., Dong, M. Y., Dong, X., Du, M. C., Du, S. X., Duan, Z. H., Egorov, P., Fan, Y. H., Fang, J., Fang, S. S., Fang, W. X., Fang, Y., Fang, Y. Q., Farinelli, R., Fava, L., Feldbauer, F., Felici, G., Feng, C. Q., Feng, J. H., Feng, Y. T., Fischer, K., Fritsch, M., Fu, C. D., Fu, J. L., Fu, Y. W., Gao, H., Gao, Y. N., Gao, Yang, Garbolino, S., Garzia, I., Ge, L., Ge, P. T., Ge, Z. W., Geng, C., Gersabeck, E. M., Gilman, A., Goetzen, K., Gong, L., Gong, W. X., Gradl, W., Gramigna, S., Greco, M., Gu, M. H., Gu, Y. T., Guan, C. Y., Guan, Z. L., Guo, A. Q., Guo, L. B., Guo, M. J., Guo, R. P., Guo, Y. P., Guskov, A., Gutierrez, J., Han, K. L., Han, T. T., Hao, X. Q., Harris, F. A., He, K. K., He, K. L., Heinsius, F. H., Heinz, C. H., Heng, Y. K., Herold, C., Holtmann, T., Hong, P. C., Hou, G. Y., Hou, X. T., Hou, Y. R., Hou, Z. L., Hu, B. Y., Hu, H. M., Hu, J. F., Hu, T., Hu, Y., Huang, G. S., Huang, K. X., Huang, L. Q., Huang, X. T., Huang, Y. P., Hussain, T., Hölzken, F., Hüsken, N, der Wiesche, N. in, Irshad, M., Jackson, J., Janchiv, S., Jeong, J. H., Ji, Q., Ji, Q. P., Ji, W., Ji, X. B., Ji, X. L., Ji, Y. Y., Jia, X. Q., Jia, Z. K., Jiang, D., Jiang, H. B., Jiang, P. C., Jiang, S. S., Jiang, T. J., Jiang, X. S., Jiang, Y., Jiao, J. B., Jiao, J. K., Jiao, Z., Jin, S., Jin, Y., Jing, M. Q., Jing, X. M., Johansson, T., Kabana, S., Kalantar-Nayestanaki, N., Kang, X. L., Kang, X. S., Kavatsyuk, M., Ke, B. C., Khachatryan, V., Khoukaz, A., Kiuchi, R., Kolcu, O. B., Kopf, B., Kuessner, M., Kui, X., Kumar, N., Kupsc, A., Kühn, W., Lane, J. J., Larin, P., Lavezzi, L., Lei, T. T., Lei, Z. H., Leithoff, H., Lellmann, M., Lenz, T., Li, C., Li, C. H., Li, Cheng, Li, D. M., Li, F., Li, G., Li, H., Li, H. B., Li, H. J., Li, H. N., Li, Hui, Li, J. R., Li, J. S., Li, Ke, Li, L. J, Li, L. K., Li, Lei, Li, M. H., Li, P. R., Li, Q. M., Li, Q. X., Li, R., Li, S. X., Li, T., Li, W. D., Li, W. G., Li, X., Li, X. H., Li, X. L., Li, Xiaoyu, Li, Y. G., Li, Z. J., Li, Z. X., Liang, C., Liang, H., Liang, Y. F., Liang, Y. T., Liao, G. R., Liao, L. Z., Liao, Y. P., Libby, J., Limphirat, A., Lin, D. X., Lin, T., Liu, B. J., Liu, B. X., Liu, C., Liu, C. X., Liu, F. H., Liu, Fang, Liu, Feng, Liu, G. M., Liu, H., Liu, H. B., Liu, H. M., Liu, Huanhuan, Liu, Huihui, Liu, J. B., Liu, J. Y., Liu, K., Liu, K. Y., Liu, Ke, Liu, L., Liu, L. C., Liu, Lu, Liu, M. H., Liu, P. L., Liu, Q., Liu, S. B., Liu, T., Liu, W. K., Liu, W. M., Liu, X., Liu, Y., Liu, Y. B., Liu, Z. A., Liu, Z. D., Liu, Z. Q., Lou, X. C., Lu, F. X., Lu, H. J., Lu, J. G., Lu, X. L., Lu, Y., Lu, Y. P., Lu, Z. H., Luo, C. L., Luo, M. X., Luo, T., Luo, X. 
L., Lyu, X. R., Lyu, Y. F., Ma, F. C., Ma, H., Ma, H. L., Ma, J. L., Ma, L. L., Ma, M. M., Ma, Q. M., Ma, R. Q., Ma, X. T., Ma, X. Y., Ma, Y., Ma, Y. M., Maas, F. E., Maggiora, M., Malde, S., Mangoni, A., Mao, Y. J., Mao, Z. P., Marcello, S., Meng, Z. X., Messchendorp, J. G., Mezzadri, G., Miao, H., Min, T. J., Mitchell, R. E., Mo, X. H., Moses, B., Muchnoi, N. Yu., Muskalla, J., Nefedov, Y., Nerling, F., Nikolaev, I. B., Ning, Z., Nisar, S., Niu, Q. L., Niu, W. D., Niu, Y., Olsen, S. L., Ouyang, Q., Pacetti, S., Pan, X., Pan, Y., Pathak, A., Patteri, P., Pei, Y. P., Pelizaeus, M., Peng, H. P., Peng, Y. Y., Peters, K., Ping, J. L., Ping, R. G., Plura, S., Prasad, V., Qi, F. Z., Qi, H., Qi, H. R., Qi, M., Qi, T. Y., Qian, S., Qian, W. B., Qiao, C. F., Qiao, X. K., Qin, J. J., Qin, L. Q., Qin, X. S., Qin, Z. H., Qiu, J. F., Qu, S. Q., Qu, Z. H., Redmer, C. F., Ren, K. J., Rivetti, A., Rolo, M., Rong, G., Rosner, Ch., Ruan, S. N., Salone, N., Sarantsev, A., Schelhaas, Y., Schoenning, K., Scodeggio, M., Shan, K. Y., Shan, W., Shan, X. Y., Shang, Z. J, Shangguan, J. F., Shao, L. G., Shao, M., Shen, C. P., Shen, H. F., Shen, W. H., Shen, X. Y., Shi, B. A., Shi, H. C., Shi, J. L., Shi, J. Y., Shi, Q. Q., Shi, R. S., Shi, S. Y., Shi, X., Song, J. J., Song, T. Z., Song, W. M., Song, Y. J., Song, Y. X., Sosio, S., Spataro, S., Stieler, F., Su, Y. J., Sun, G. B., Sun, G. X., Sun, H., Sun, H. K., Sun, J. F., Sun, K., Sun, L., Sun, S. S., Sun, T., Sun, W. Y., Sun, Y., Sun, Y. J., Sun, Y. Z., Sun, Z. Q., Sun, Z. T., Tang, C. J., Tang, G. Y., Tang, J., Tang, Y. A., Tao, L. Y., Tao, Q. T., Tat, M., Teng, J. X., Thoren, V., Tian, W. H., Tian, Y., Tian, Z. F., Uman, I., Wan, Y., Wang, S. J., Wang, B., Wang, B. L., Wang, Bo, Wang, D. Y., Wang, F., Wang, H. J., Wang, J. P., Wang, K., Wang, L. L., Wang, M., Wang, Meng, Wang, N. Y., Wang, S., Wang, T., Wang, T. J., Wang, W., Wang, W. P., Wang, X., Wang, X. F., Wang, X. J., Wang, X. L., Wang, X. N., Wang, Y., Wang, Y. D., Wang, Y. F., Wang, Y. L., Wang, Y. N., Wang, Y. Q., Wang, Yaqian, Wang, Yi, Wang, Z., Wang, Z. L., Wang, Z. Y., Wang, Ziyi, Wei, D., Wei, D. H., Weidner, F., Wen, S. P., Wen, Y. R., Wiedner, U., Wilkinson, G., Wolke, M., Wollenberg, L., Wu, C., Wu, J. F., Wu, L. H., Wu, L. J., Wu, X., Wu, X. H., Wu, Y., Wu, Y. H., Wu, Y. J., Wu, Z., Xia, L., Xian, X. M., Xiang, B. H., Xiang, T., Xiao, D., Xiao, G. Y., Xiao, S. Y., Xiao, Y. L., Xiao, Z. J., Xie, C., Xie, X. H., Xie, Y., Xie, Y. G., Xie, Y. H., Xie, Z. P., Xing, T. Y., Xu, C. F., Xu, C. J., Xu, G. F., Xu, H. Y., Xu, Q. J., Xu, Q. N., Xu, W., Xu, W. L., Xu, X. P., Xu, Y. C., Xu, Z. P., Xu, Z. S., Yan, F., Yan, L., Yan, W. B., Yan, W. C., Yan, X. Q., Yang, H. J., Yang, H. L., Yang, H. X., Yang, Tao, Yang, Y., Yang, Y. F., Yang, Y. X., Yang, Yifan, Yang, Z. W., Yao, Z. P., Ye, M., Ye, M. H., Yin, J. H., You, Z. Y., Yu, B. X., Yu, C. X., Yu, G., Yu, J. S., Yu, T., Yu, X. D., Yu, Y. C., Yuan, C. Z., Yuan, J., Yuan, L., Yuan, S. C., Yuan, Y., Yuan, Z. Y., Yue, C. X., Zafar, A. A., Zeng, F. R., Zeng, S. H., Zeng, X., Zeng, Y., Zeng, Y. J., Zhai, X. Y., Zhai, Y. C., Zhan, Y. H., Zhang, A. Q., Zhang, B. L., Zhang, B. X., Zhang, D. H., Zhang, G. Y., Zhang, H., Zhang, H. C., Zhang, H. H., Zhang, H. Q., Zhang, H. Y., Zhang, J., Zhang, J. J., Zhang, J. L., Zhang, J. Q., Zhang, J. W., Zhang, J. X., Zhang, J. Y., Zhang, J. Z., Zhang, Jianyu, Zhang, L. M., Zhang, Lei, Zhang, P., Zhang, Q. Y., Zhang, R. Y, Zhang, Shuihan, Zhang, Shulei, Zhang, X. D., Zhang, X. M., Zhang, X. Y., Zhang, Y., Zhang, Y. T., Zhang, Y. 
H., Zhang, Y. M., Zhang, Yan, Zhang, Yao, Zhang, Z. D., Zhang, Z. H., Zhang, Z. L., Zhang, Z. Y., Zhao, G., Zhao, J. Y., Zhao, J. Z., Zhao, Lei, Zhao, Ling, Zhao, M. G., Zhao, R. P., Zhao, S. J., Zhao, Y. B., Zhao, Y. X., Zhao, Z. G., Zhemchugov, A., Zheng, B., Zheng, J. P., Zheng, W. J., Zheng, Y. H., Zhong, B., Zhong, X., Zhou, H., Zhou, J. Y., Zhou, L. P., Zhou, X., Zhou, X. K., Zhou, X. R., Zhou, X. Y., Zhou, Y. Z., Zhu, J., Zhu, K., Zhu, K. J., Zhu, L., Zhu, L. X., Zhu, S. H., Zhu, S. Q., Zhu, T. J., Zhu, W. J., Zhu, Y. C., Zhu, Z. A., Zou, J. H., and Zu, J.
- Subjects
High Energy Physics - Experiment
- Abstract
Using (2712.4 $\pm$ 14.3)$\times 10^{6}$ $\psi(3686)$ events collected with the BESIII detector at BEPCII, a partial wave analysis of the decay $\psi(3686) \to \phi \eta \eta' $ is performed with the covariant tensor approach. An axial-vector state with a mass near 2.3 $\rm GeV/c^2$ is observed for the first time. Its mass and width are measured to be 2316 $\pm 9_{\mathrm{stat}} \pm 30_{\mathrm{syst}}\,\rm MeV/c^2$ and 89 $\pm 15_{\mathrm{stat}} \pm 26_{\mathrm{syst}}\,\rm MeV$, respectively. The product branching fractions of $\mathcal{B}(\psi(3686) \to X(2300) \eta') \mathcal{B}(X(2300)\to \phi \eta)$ and $\mathcal{B}(\psi(3686) \to X(2300) \eta)\mathcal{B}(X(2300)\to \phi \eta')$ are determined to be (4.8 $\pm 1.3_{\mathrm{stat}} \pm 0.7_{\mathrm{syst}})\times 10^{-6}$ and (2.2 $\pm 0.7_{\mathrm{stat}} \pm 0.7_{\mathrm{syst}})\times 10^{-6}$, respectively. The branching fraction $\mathcal{B}(\psi(3686) \to \phi \eta \eta')$ is measured for the first time to be (3.14$\pm0.17_{\mathrm{stat}}\pm0.24_{\mathrm{syst}})\times10^{-5}$. The first uncertainties are statistical and the second are systematic.
- Published
- 2024
11. Single-atom-resolved vibrational spectroscopy of a dislocation
- Author
Jiang, Hailing, Wang, Tao, Zhang, Zhenyu, Shi, Ruochen, Xu, Xifan, Sheng, Bowen, Liu, Fang, Ge, Weikun, Wang, Ping, Shen, Bo, Gao, Peng, Lindsay, Lucas R, and Wang, Xinqiang
- Subjects
Condensed Matter - Materials Science
- Abstract
Phonon resistance from dislocation scattering is often divided into short-range core interactions and long-range strain field interactions. Using electron energy-loss spectroscopy on a GaN dislocation, we report observations of vibrational modes localized at specific core atoms (short-range) and strain-driven phonon energy shifts around the dislocation (long-range). Ab initio calculations support these findings and draw out additional details. This study reveals atomically resolved vibrational spectra of dislocations, thus offering insights for engineering improved material functionalities.
- Published
- 2024
12. Room-temperature valley-selective emission in Si-MoSe2 heterostructures enabled by high-quality-factor chiroptical cavities
- Author
Pan, Feng, Li, Xin, Johnson, Amalya C., Dhuey, Scott, Saunders, Ashley, Hu, Meng-Xia, Dixon, Jefferson P., Dagli, Sahil, Lau, Sze-Cheung, Weng, Tingting, Chen, Chih-Yi, Zeng, Jun-Hao, Apte, Rajas, Heinz, Tony F., Liu, Fang, Deng, Zi-Lan, and Dionne, Jennifer A.
- Subjects
Physics - Optics, Condensed Matter - Materials Science
- Abstract
Transition metal dichalcogenides (TMDCs) possess valley pseudospin, allowing photon spin to be coupled to electron spin and enabling initialization and readout of both classical and quantum information. Rapid valley-dephasing processes have impeded the development of scalable, high-performance valleytronic devices operating at room temperature. Here we demonstrate that a chiral resonant metasurface can enable room-temperature valley-selective emission, even with linearly polarized excitation. This platform provides circular eigen-polarization states with a high quality factor (Q-factor) and strong chiral near-field enhancement, resulting in unitary emission circular dichroism (i.e. single-handed circularly polarized emission). Our fabricated Si chiral metasurfaces exhibit chiral electromagnetic modes with Q-factors up to 450 at visible wavelengths, spectrally tuned to the exciton energy of MoSe2 monolayers. Using spatially- and spectrally-resolved mapping from temperatures of 100 K to 294 K, we demonstrate degrees of circular polarization (DOP) reaching a record high of 0.5 at room temperature. Reciprocal space mapping of the exciton emission reveals the chiral q-BIC localizes valley-selective emission in the vicinity of the photonic gamma-point. Photon-spin and time-resolved photoluminescence measurements show that the high DOP can be attributed to the significantly increased chiroptical local density of states provided by the metasurface, which enhances valley-specific radiative transition rates by a factor of approximately 13, with lifetimes as short as 189 ps. Our work could facilitate the development of compact chiral classical and quantum light sources and the creation of molecular chiral polaritons for quantum enantioselective synthesis.
- Published
- 2024
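The degree of circular polarization (DOP) reported above is the standard intensity contrast between the two circularly polarized emission channels, which serves as the readout of valley polarization:

```latex
\mathrm{DOP} = \frac{I_{\sigma^{+}} - I_{\sigma^{-}}}{I_{\sigma^{+}} + I_{\sigma^{-}}}
```

A DOP of 0.5 therefore corresponds to one helicity carrying three times the intensity of the other.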
13. LSVOS Challenge Report: Large-scale Complex and Long Video Object Segmentation
- Author
Ding, Henghui, Hong, Lingyi, Liu, Chang, Xu, Ning, Yang, Linjie, Fan, Yuchen, Miao, Deshui, Gu, Yameng, Li, Xin, He, Zhenyu, Wang, Yaowei, Yang, Ming-Hsuan, Chai, Jinming, Ma, Qin, Zhang, Junpei, Jiao, Licheng, Liu, Fang, Liu, Xinyu, Zhang, Jing, Zhang, Kexin, Liu, Xu, Li, LingLing, Fang, Hao, Pan, Feiyu, Lu, Xiankai, Zhang, Wei, Cong, Runmin, Tran, Tuyen, Cao, Bin, Zhang, Yisi, Wang, Hanyi, He, Xingjian, and Liu, Jing
- Subjects
Computer Science - Computer Vision and Pattern Recognition
- Abstract
Despite the promising performance of current video segmentation models on existing benchmarks, these models still struggle with complex scenes. In this paper, we introduce the 6th Large-scale Video Object Segmentation (LSVOS) challenge in conjunction with the ECCV 2024 workshop. This year's challenge includes two tasks: Video Object Segmentation (VOS) and Referring Video Object Segmentation (RVOS). This year, we replace the classic YouTube-VOS and YouTube-RVOS benchmarks with the latest datasets MOSE, LVOS, and MeViS to assess VOS under more challenging complex environments. This year's challenge attracted 129 registered teams from more than 20 institutes across over 8 countries. This report includes the challenge and dataset introduction, and the methods used by the top 7 teams in the two tracks. More details can be found at our homepage https://lsvos.github.io/.
- Comment: ECCV 2024 LSVOS Challenge Report: https://lsvos.github.io/
- Published
- 2024
14. Renormalized Connection for Scale-preferred Object Detection in Satellite Imagery
- Author
Zhang, Fan, Li, Lingling, Jiao, Licheng, Liu, Xu, Liu, Fang, Yang, Shuyuan, and Hou, Biao
- Subjects
Computer Science - Computer Vision and Pattern Recognition
- Abstract
Satellite imagery, due to its long-range imaging, brings with it a variety of scale-preferred tasks, such as the detection of tiny/small objects, making the precise localization and detection of small objects of interest a challenging task. In this article, we design a Knowledge Discovery Network (KDN) to implement the renormalization group theory in terms of efficient feature extraction. Renormalized connection (RC) on the KDN enables "synergistic focusing" of multi-scale features. Based on our observations of KDN, we abstract a class of RCs with different connection strengths, called n21C, and generalize it to FPN-based multi-branch detectors. In a series of FPN experiments on the scale-preferred tasks, we found that the "divide-and-conquer" idea of FPN severely hampers the detector's learning in the right direction due to the large number of large-scale negative samples and interference from background noise. Moreover, these negative samples cannot be eliminated by the focal loss function. The RCs extend the multi-level feature's "divide-and-conquer" mechanism of the FPN-based detectors to a wide range of scale-preferred tasks, and enable synergistic effects of multi-level features on the specific learning goal. In addition, interference activations in two aspects are greatly reduced and the detector learns in a more correct direction. Extensive experiments of 17 well-designed detection architectures embedded with n21Cs on five different levels of scale-preferred tasks validate the effectiveness and efficiency of the RCs. In particular, E421C, the simplest linear form of RC, performs well in all tasks and satisfies the scaling property of RGT. We hope that our approach will transfer a large number of well-designed detectors from the computer vision community to the remote sensing community.
- Comment: 24 pages, 14 figures
- Published
- 2024
15. CSS-Segment: 2nd Place Report of LSVOS Challenge VOS Track
- Author
Chai, Jinming, Ma, Qin, Zhang, Junpei, Jiao, Licheng, and Liu, Fang
- Subjects
Computer Science - Computer Vision and Pattern Recognition
- Abstract
Video object segmentation is a challenging task that serves as the cornerstone of numerous downstream applications, including video editing and autonomous driving. In this technical report, we briefly introduce the solution of our team "yuanjie" for video object segmentation in the 6th LSVOS Challenge VOS Track at ECCV 2024. We believe that our proposed CSS-Segment will perform better in videos of complex object motion and long-term presentation. In this report, we successfully validated the effectiveness of CSS-Segment in video object segmentation. Finally, our method achieved a J&F score of 80.84 in the test phase, and ultimately ranked 2nd in the 6th LSVOS Challenge VOS Track at ECCV 2024.
- Published
- 2024
16. GaussianOcc: Fully Self-supervised and Efficient 3D Occupancy Estimation with Gaussian Splatting
- Author
Gan, Wanshui, Liu, Fang, Xu, Hongbin, Mo, Ningkai, and Yokoya, Naoto
- Subjects
Computer Science - Computer Vision and Pattern Recognition
- Abstract
We introduce GaussianOcc, a systematic method that investigates the two usages of Gaussian splatting for fully self-supervised and efficient 3D occupancy estimation in surround views. First, traditional methods for self-supervised 3D occupancy estimation still require ground truth 6D poses from sensors during training. To address this limitation, we propose the Gaussian Splatting for Projection (GSP) module to provide accurate scale information for fully self-supervised training from adjacent view projection. Additionally, existing methods rely on volume rendering for final 3D voxel representation learning using 2D signals (depth maps, semantic maps), which is both time-consuming and less effective. We propose Gaussian Splatting from Voxel space (GSV) to leverage the fast rendering properties of Gaussian splatting. As a result, the proposed GaussianOcc method enables fully self-supervised (no ground truth pose) 3D occupancy estimation with competitive performance and low computational cost (2.7 times faster in training and 5 times faster in rendering). The relevant code will be available at https://github.com/GANWANSHUI/GaussianOcc.git.
- Comment: Project page: https://ganwanshui.github.io/GaussianOcc/
- Published
- 2024
17. Enhancing Automated Program Repair with Solution Design
- Author
Zhao, Jiuang, Yang, Donghao, Zhang, Li, Lian, Xiaoli, Yang, Zitian, and Liu, Fang
- Subjects
Computer Science - Software Engineering, Computer Science - Artificial Intelligence
- Abstract
Automatic Program Repair (APR) endeavors to autonomously rectify issues within specific projects, which generally encompasses three categories of tasks: bug resolution, new feature development, and feature enhancement. Despite extensive research proposing various methodologies, their efficacy in addressing real issues remains unsatisfactory. It is worth noting that engineers typically have design rationales (DR), i.e., planned solutions and a set of underlying reasons, before they start patching code. In open-source projects, these DRs are frequently captured in issue logs through project management tools like Jira. This raises a compelling question: How can we leverage the DR scattered across the issue logs to efficiently enhance APR? To investigate this premise, we introduce DRCodePilot, an approach designed to augment GPT-4-Turbo's APR capabilities by incorporating DR into the prompt instruction. Furthermore, given GPT-4's constraints in fully grasping the broader project context and occasional shortcomings in generating precise identifiers, we have devised a feedback-based self-reflective framework, in which we prompt GPT-4 to reconsider and refine its outputs by referencing a provided patch and suggested identifiers. We have established a benchmark comprising 938 issue-patch pairs sourced from two open-source repositories hosted on GitHub and Jira. Our experimental results are impressive: DRCodePilot achieves a full-match ratio that is a remarkable 4.7x higher than when GPT-4 is utilized directly. Additionally, the CodeBLEU scores also exhibit promising enhancements. Moreover, our findings reveal that the standalone application of DR can yield a promising increase in the full-match ratio across CodeLlama, GPT-3.5, and GPT-4 within our benchmark suite. We believe that our DRCodePilot initiative heralds a novel human-in-the-loop avenue for advancing the field of APR.
- Comment: Will appear in ASE 2024
- Published
- 2024
18. DC-DC Converters Optimization in Case of Large Variation in the Load
- Author
Domyshev, Alexander, Chistyakova, Elena, Dreglea, Aliona, Sidorov, Denis, and Liu, Fang
- Subjects
Electrical Engineering and Systems Science - Systems and Control, Mathematics - Dynamical Systems, 34H05 34A09
- Abstract
A method for controlling a DC-DC converter is proposed that ensures high-quality control under large fluctuations in load currents by using differential gain control coefficients and second-derivative control. Various implementations of balancing the currents of a multiphase DC-DC converter are discussed, with a focus on achieving accurate current regulation without introducing additional delay in the control system. A stochastic particle swarm optimization method is used to find optimal values of the PID controller parameters. Automatic constraint handling in optimization is also discussed as a relevant technique in the field.
- Published
- 2024
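The optimization loop in this abstract, a particle swarm search over PID gains, is compact enough to sketch end to end. The plant and cost below are illustrative assumptions (a discretized first-order lag standing in for the converter, and an ITAE-style cost), not the converter model from the paper; the derivative-gain bound is kept small so the toy simulation stays stable.

```python
import numpy as np

rng = np.random.default_rng(0)

def step_cost(gains, dt=0.001, t_end=0.5):
    """ITAE-style cost of a unit step response for PID gains (kp, ki, kd)
    on a toy first-order plant y' = (-y + u) / tau."""
    kp, ki, kd = gains
    tau, y, integ, prev_e, cost = 0.02, 0.0, 0.0, 1.0, 0.0
    for k in range(int(t_end / dt)):
        e = 1.0 - y
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - prev_e) / dt
        prev_e = e
        y += dt * (-y + u) / tau
        if abs(y) > 1e6:                 # bail out on unstable gain sets
            return 1e9
        cost += (k * dt) * abs(e) * dt   # time-weighted absolute error
    return cost

def pso(f, bounds, n=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    """Plain global-best particle swarm search over box bounds."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n, len(lo)))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[pval.argmin()]
    for _ in range(iters):
        v = (w * v + c1 * rng.random(x.shape) * (pbest - x)
                   + c2 * rng.random(x.shape) * (g - x))
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        better = val < pval
        pbest[better], pval[better] = x[better], val[better]
        g = pbest[pval.argmin()]
    return g, pval.min()

gains, cost = pso(step_cost, (np.array([0, 0, 0]), np.array([20, 200, 0.01])))
print(gains, cost)
```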
19. Uncovering Weaknesses in Neural Code Generation
- Author
Lian, Xiaoli, Wang, Shuaisong, Ma, Jieping, Liu, Fang, Tan, Xin, Zhang, Li, Shi, Lin, and Gao, Cuiyun
- Subjects
Computer Science - Software Engineering
- Abstract
Code generation, the task of producing source code from prompts, has seen significant advancements with the advent of pre-trained large language models (PLMs). Despite these achievements, a comprehensive taxonomy of weaknesses in benchmarks and generated code is lacking, which risks focusing the community on known issues at the cost of under-explored areas. Our systematic study aims to fill this gap by evaluating five state-of-the-art PLMs: three larger models, CodeGen2.5 with 7 billion parameters, CodeGeeX2 with 6 billion parameters, and GPT-4 Turbo, and two smaller ones, UnixCoder with 110 million parameters and CodeT5 base with 220 million parameters, across three popular datasets, CoNaLa, HumanEval Plus, and DS-1000. We assess the quality of generated code using match-based and execution-based metrics, then conduct thematic analysis to develop a taxonomy of nine types of weaknesses. We dissected weakness distributions in both larger and smaller models, applying an extensive methodology that encompasses model-specific as well as collective analysis (union and intersection) across models. Our research uncovers three salient findings: 1. In the CoNaLa dataset, inaccurate prompts are a notable problem, causing all large models to fail in 26.84% of cases, with even higher failure rates of 40% for smaller models; 2. Missing pivotal semantics is a pervasive issue across benchmarks, with one or more large models omitting key semantics in 65.78% of CoNaLa tasks, and similarly high occurrences in HumanEval Plus (66.09%) and DS-1000 (80.51%); 3. All models struggle with proper API usage, a challenge amplified by vague or complex prompts. Our findings aim to steer researchers towards addressing specific weaknesses and challenges in code generation. Furthermore, our annotations can offer a targeted benchmark subset for detailed analysis.
- Published
- 2024
20. Macroscopic uniform 2D moiré superlattices with controllable angles
- Author
Zaborski Jr., Gregory, Majchrzak, Paulina E., Lai, Samuel, Johnson, Amalya C., Saunders, Ashley P., Zhu, Ziyan, Deng, Yujun, Lu, Donghui, Hashimoto, Makoto, Shen, Z-X, and Liu, Fang
- Subjects
Condensed Matter - Materials Science, Condensed Matter - Mesoscale and Nanoscale Physics
- Abstract
Moiré superlattices, engineered through precise stacking of van der Waals (vdW) layers, hold immense promise for exploring strongly correlated and topological phenomena. However, these applications have been held back by the common preparation method: tear-and-stack of Scotch tape exfoliated monolayers. It has low efficiency and reproducibility, along with challenges of twist angle inhomogeneity, interfacial contamination, micrometer sizes, and a tendency to untwist at elevated temperatures. Here we report an effective strategy to construct highly consistent vdW moiré structures with high production throughput, near-unity yield, pristine interfaces, precisely controlled twist angles, and macroscopic scale (up to centimeters) with enhanced thermal stability. We further demonstrate the versatility across various vdW materials including transition metal dichalcogenides, graphene, and hBN. The expansive size and high quality of the moiré structures enable high-resolution mapping of the reciprocal space back-folded lattices and moiré mini band structures with low energy electron diffraction (LEED) and angle-resolved photoemission spectroscopy (ARPES). This technique will have broad applications in both fundamental studies and mass production of twistronic devices.
- Comment: 16 pages, 4 figures
- Published
- 2024
21. Fast and Efficient: Mask Neural Fields for 3D Scene Segmentation
- Author
Gao, Zihan, Li, Lingling, Jiao, Licheng, Liu, Fang, Liu, Xu, Ma, Wenping, Guo, Yuwei, and Yang, Shuyuan
- Subjects
Computer Science - Computer Vision and Pattern Recognition
- Abstract
Understanding 3D scenes is a crucial challenge in computer vision research with applications spanning multiple domains. Recent advancements in distilling 2D vision-language foundation models into neural fields, like NeRF and 3DGS, enable open-vocabulary segmentation of 3D scenes from 2D multi-view images without the need for precise 3D annotations. While effective, the per-pixel distillation of high-dimensional CLIP features introduces ambiguity and necessitates complex regularization strategies, adding inefficiencies during training. This paper presents MaskField, which enables fast and efficient 3D open-vocabulary segmentation with neural fields under weak supervision. Unlike previous methods, MaskField distills masks rather than dense high-dimensional CLIP features. MaskField employs neural fields as binary mask generators and supervises them with masks generated by SAM and classified by coarse CLIP features. MaskField overcomes ambiguous object boundaries by naturally introducing SAM segmented object shapes without extra regularization during training. By circumventing the direct handling of high-dimensional CLIP features during training, MaskField is particularly compatible with explicit scene representations like 3DGS. Our extensive experiments show that MaskField not only surpasses prior state-of-the-art methods but also achieves remarkably fast convergence, outperforming previous methods with just 5 minutes of training. We hope that MaskField will inspire further exploration into how neural fields can be trained to comprehend 3D scenes from 2D models.
- Comment: 16 pages, 7 figures
- Published
- 2024
22. Observation of the Electromagnetic Dalitz Transition $h_c \rightarrow e^+e^-\eta_c$
- Author
BESIII Collaboration, Ablikim, M., Achasov, M. N., Adlarson, P., Ahmed, S., Albrecht, M., Aliberti, R., Amoroso, A., An, M. R., An, Q., Bai, X. H., Bai, Y., Bakina, O., Ferroli, R. Baldini, Balossino, I., Ban, Y., Begzsuren, K., Berger, N., Bertani, M., Bettoni, D., Bianchi, F., Bloms, J., Bortone, A., Boyko, I., Briere, R. A., Cai, H., Cai, X., Calcaterra, A., Cao, G. F., Cao, N., Cetin, S. A., Chang, J. F., Chang, W. L., Chelkov, G., Chen, D. Y., Chen, G., Chen, H. S., Chen, M. L., Chen, S. J., Chen, X. R., Chen, Y. B., Chen, Z. J, Cheng, W. S., Cibinetto, G., Cossio, F., Cui, X. F., Dai, H. L., Dai, X. C., Dbeyssi, A., de Boer, R. E., Dedovich, D., Deng, Z. Y., Denig, A., Denysenko, I., Destefanis, M., De Mori, F., Ding, Y., Dong, C., Dong, J., Dong, L. Y., Dong, M. Y., Dong, X., Du, S. X., Fan, Y. L., Fang, J., Fang, S. S., Fang, Y., Farinelli, R., Fava, L., Feldbauer, F., Felici, G., Feng, C. Q., Feng, J. H., Fritsch, M., Fu, C. D., Gao, Y., Gao, Y. G., Garzia, I., Ge, P. T., Geng, C., Gersabeck, E. M., Gilman, A, Goetzen, K., Gong, L., Gong, W. X., Gradl, W., Greco, M., Gu, L. M., Gu, M. H., Gu, S., Gu, Y. T., Guan, C. Y, Guo, A. Q., Guo, L. B., Guo, R. P., Guo, Y. P., Guskov, A., Han, T. T., Han, W. Y., Hao, X. Q., Harris, F. A., Hüsken, N, He, K. L., Heinsius, F. H., Heinz, C. H., Held, T., Heng, Y. K., Herold, C., Himmelreich, M., Holtmann, T., Hou, Y. R., Hou, Z. L., Hu, H. M., Hu, J. F., Hu, T., Hu, Y., Huang, G. S., Huang, L. Q., Huang, X. T., Huang, Y. P., Huang, Z., Hussain, T., Andersson, W. Ikegami, Imoehl, W., Irshad, M., Jaeger, S., Janchiv, S., Ji, Q., Ji, Q. P., Ji, X. B., Ji, X. L., Ji, Y. Y., Jiang, H. B., Jiang, X. S., Jiao, J. B., Jiao, Z., Jin, S., Jin, Y., Johansson, T., Kalantar-Nayestanaki, N., Kang, X. S., Kappert, R., Kavatsyuk, M., Ke, B. C., Keshk, I. K., Khoukaz, A., Kiese, P., Kiuchi, R., Kliemt, R., Koch, L., Kolcu, O. B., Kopf, B., Kuemmel, M., Kuessner, M., Kupsc, A., Kurth, M. G., Kühn, W., Lane, J. J., Lange, J. S., Larin, P., Lavania, A., Lavezzi, L., Lei, Z. H., Leithoff, H., Lellmann, M., Lenz, T., Li, C., Li, C. H., Li, Cheng, Li, D. M., Li, F., Li, G., Li, H., Li, H. B., Li, H. J., Li, J. L., Li, J. Q., Li, J. S., Li, Ke, Li, L. K., Li, Lei, Li, P. R., Li, S. Y., Li, W. D., Li, W. G., Li, X. H., Li, X. L., Li, Xiaoyu, Li, Z. Y., Liang, H., Liang, Y. F., Liang, Y. T., Liao, G. R., Liao, L. Z., Libby, J., Lin, C. X., Liu, B. J., Liu, C. X., Liu, D., Liu, F. H., Liu, Fang, Liu, Feng, Liu, H. B., Liu, H. M., Liu, Huanhuan, Liu, Huihui, Liu, J. B., Liu, J. L., Liu, J. Y., Liu, K., Liu, K. Y., Liu, Ke, Liu, L., Liu, M. H., Liu, P. L., Liu, Q., Liu, S. B., Liu, Shuai, Liu, T., Liu, W. M., Liu, X., Liu, Y., Liu, Y. B., Liu, Z. A., Liu, Z. Q., Lou, X. C., Lu, F. X., Lu, H. J., Lu, J. D., Lu, J. G., Lu, X. L., Lu, Y., Lu, Y. P., Luo, C. L., Luo, M. X., Luo, P. W., Luo, T., Luo, X. L., Lusso, S., Lyu, X. R., Ma, F. C., Ma, H. L., Ma, L. L., Ma, M. M., Ma, Q. M., Ma, R. Q., Ma, R. T., Ma, X. X., Ma, X. Y., Maas, F. E., Maggiora, M., Maldaner, S., Malde, S., Malik, Q. A., Mangoni, A., Mao, Y. J., Mao, Z. P., Marcello, S., Meng, Z. X., Messchendorp, J. G., Mezzadri, G., Min, T. J., Mitchell, R. E., Mo, X. H., Mo, Y. J., Muchnoi, N. Yu., Muramatsu, H., Nakhoul, S., Nefedov, Y., Nerling, F., Nikolaev, I. B., Ning, Z., Nisar, S., Olsen, S. L., Ouyang, Q., Pacetti, S., Pan, X., Pan, Y., Pathak, A., Patteri, P., Pelizaeus, M., Peng, H. P., Peters, K., Pettersson, J., Ping, J. L., Ping, R. G., Poling, R., Prasad, V., Qi, H., Qi, H. R., Qi, K. H., Qi, M., Qi, T. 
Y., Qian, S., Qian, W. B., Qian, Z., Qiao, C. F., Qin, L. Q., Qin, X. P., Qin, X. S., Qin, Z. H., Qiu, J. F., Qu, S. Q., Rashid, K. H., Ravindran, K., Redmer, C. F., Rivetti, A., Rodin, V., Rolo, M., Rong, G., Rosner, Ch., Rump, M., Sang, H. S., Sarantsev, A., Schelhaas, Y., Schnier, C., Schoenning, K., Scodeggio, M., Shan, D. C., Shan, W., Shan, X. Y., Shangguan, J. F., Shao, M., Shen, C. P., Shen, P. X., Shen, X. Y., Shi, H. C., Shi, R. S., Shi, X., Shi, X. D, Song, J. J., Song, W. M., Song, Y. X., Sosio, S., Spataro, S., Su, K. X., Su, P. P., Sui, F. F., Sun, G. X., Sun, H. K., Sun, J. F., Sun, L., Sun, S. S., Sun, T., Sun, W. Y., Sun, X, Sun, Y. J., Sun, Y. K., Sun, Y. Z., Sun, Z. T., Tan, Y. H., Tan, Y. X., Tang, C. J., Tang, G. Y., Tang, J., Teng, J. X., Thoren, V., Tian, Y. T., Uman, I., Wang, B., Wang, C. W., Wang, D. Y., Wang, H. J., Wang, H. P., Wang, K., Wang, L. L., Wang, M., Wang, M. Z., Wang, Meng, Wang, W., Wang, W. H., Wang, W. P., Wang, X., Wang, X. F., Wang, X. L., Wang, Y., Wang, Y. D., Wang, Y. F., Wang, Y. Q., Wang, Y. Y., Wang, Z., Wang, Z. Y., Wang, Ziyi, Wang, Zongyuan, Wei, D. H., Weidenkaff, P., Weidner, F., Wen, S. P., White, D. J., Wiedner, U., Wilkinson, G., Wolke, M., Wollenberg, L., Wu, J. F., Wu, L. H., Wu, L. J., Wu, X., Wu, Z., Xia, L., Xiao, H., Xiao, S. Y., Xiao, Z. J., Xie, X. H., Xie, Y. G., Xie, Y. H., Xing, T. Y., Xu, G. F., Xu, Q. J., Xu, W., Xu, X. P., Xu, Y. C., Yan, F., Yan, L., Yan, W. B., Yan, W. C., Yan, Xu, Yang, H. J., Yang, H. X., Yang, L., Yang, S. L., Yang, Y. X., Yang, Yifan, Yang, Zhi, Ye, M., Ye, M. H., Yin, J. H., You, Z. Y., Yu, B. X., Yu, C. X., Yu, G., Yu, J. S., Yu, T., Yuan, C. Z., Yuan, L., Yuan, X. Q., Yuan, Y., Yuan, Z. Y., Yue, C. X., Yuncu, A., Zafar, A. A., Zeng, Y., Zhang, B. X., Zhang, Guangyi, Zhang, H., Zhang, H. H., Zhang, H. Y., Zhang, J. J., Zhang, J. L., Zhang, J. Q., Zhang, J. W., Zhang, J. Y., Zhang, J. Z., Zhang, Jianyu, Zhang, Jiawei, Zhang, L. M., Zhang, L. Q., Zhang, Lei, Zhang, S., Zhang, S. F., Zhang, Shulei, Zhang, X. D., Zhang, X. Y., Zhang, Y., Zhang, Y. H., Zhang, Y. T., Zhang, Yan, Zhang, Yao, Zhang, Yi, Zhang, Z. H., Zhang, Z. Y., Zhao, G., Zhao, J., Zhao, J. Y., Zhao, J. Z., Zhao, Lei, Zhao, Ling, Zhao, M. G., Zhao, Q., Zhao, S. J., Zhao, Y. B., Zhao, Y. X., Zhao, Z. G., Zhemchugov, A., Zheng, B., Zheng, J. P., Zheng, W. J., Zheng, Y., Zheng, Y. H., Zhong, B., Zhong, C., Zhou, L. P., Zhou, Q., Zhou, X., Zhou, X. K., Zhou, X. R., Zhou, X. Y., Zhu, A. N., Zhu, J., Zhu, K., Zhu, K. J., Zhu, S. H., Zhu, T. J., Zhu, W. J., Zhu, Y. C., Zhu, Z. A., Zou, B. S., and Zou, J. H.
- Subjects
High Energy Physics - Experiment
- Abstract
Using $(27.12\pm 0.14)\times10^8$ $\psi(3686)$ decays and data samples of $e^+e^-$ collisions with $\sqrt{s}$ from 4.130 to 4.780~GeV collected with the BESIII detector, we report the first observation of the electromagnetic Dalitz transition $h_c\to e^+e^-\eta_c$ with a statistical significance of $5.4\sigma$. We measure the ratio of the branching fractions $\frac{\mathcal{B}(h_c\rightarrow e^+e^-\eta_c)}{\mathcal{B}(h_c\rightarrow \gamma \eta_c)}$ separately for the $h_c$ samples produced via $\psi(3686)\to\pi^0h_c$ and $e^+e^-\to\pi^+\pi^-h_c$. The average ratio is determined to be $(0.59\pm0.10(\text{stat.})\pm0.04(\text{syst.}))\%$, where the uncertainty includes both statistical and systematic components.
- Published
- 2024
23. Three-Stream Temporal-Shift Attention Network Based on Self-Knowledge Distillation for Micro-Expression Recognition
- Author
Zhu, Guanghao, Liu, Lin, Hu, Yuhao, Sun, Haixin, Liu, Fang, Du, Xiaohui, Hao, Ruqian, Liu, Juanxiu, Liu, Yong, Deng, Hao, and Zhang, Jing
- Subjects
Computer Science - Computer Vision and Pattern Recognition
- Abstract
Micro-expressions are subtle facial movements that occur spontaneously when people try to conceal real emotions. Micro-expression recognition is crucial in many fields, including criminal analysis and psychotherapy. However, micro-expression recognition is challenging since micro-expressions have low intensity and public datasets are small in size. To this end, a three-stream temporal-shift attention network based on self-knowledge distillation called SKD-TSTSAN is proposed in this paper. Firstly, to address the low intensity of muscle movements, we utilize learning-based motion magnification modules to enhance the intensity of muscle movements. Secondly, we employ efficient channel attention modules in the local-spatial stream to make the network focus on facial regions that are highly relevant to micro-expressions. In addition, temporal shift modules are used in the dynamic-temporal stream, which enables temporal modeling with no additional parameters by mixing motion information from two different temporal domains. Furthermore, we introduce self-knowledge distillation into the micro-expression recognition task by introducing auxiliary classifiers and using the deepest section of the network for supervision, encouraging all blocks to fully explore the features of the training set. Finally, extensive experiments are conducted on four public datasets: CASME II, SAMM, MMEW, and CAS(ME)3. The experimental results demonstrate that our SKD-TSTSAN outperforms other existing methods and achieves new state-of-the-art performance. Our code will be available at https://github.com/GuanghaoZhu663/SKD-TSTSAN.
- Published
- 2024
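Among the components listed, the temporal shift module is the most self-contained: it mixes information between neighboring frames by shifting a fraction of channels along the time axis and adds zero parameters. The sketch below follows the commonly used TSM formulation; the 1/8 shift fraction and the (batch, time, channels, H, W) layout are conventional defaults assumed here, not values taken from the paper.

```python
import torch

def temporal_shift(x, shift_div=8):
    """x: (batch, time, channels, H, W). Shift the first C/shift_div
    channels one step backward in time, the next C/shift_div forward,
    and leave the rest untouched; no extra parameters are introduced."""
    b, t, c, h, w = x.shape
    fold = c // shift_div
    out = torch.zeros_like(x)
    out[:, :-1, :fold] = x[:, 1:, :fold]                  # pull from the future
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]  # pull from the past
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]             # identity channels
    return out

x = torch.randn(2, 6, 16, 8, 8)
print(temporal_shift(x).shape)   # torch.Size([2, 6, 16, 8, 8])
```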
24. Low-Voltage Electron Emission by Graphene-hBN-graphene Heterostructure
- Author
Wang, Zhexuan, Liu, Fang, Cui, Kaiyu, Feng, Xue, Zhang, Wei, and Huang, Yidong
- Subjects
Physics - Applied Physics, Condensed Matter - Mesoscale and Nanoscale Physics
- Abstract
Scanning electron microscopes (SEMs) with low-energy electron sources (accelerating voltage of less than 1000 V) have important application requirements in many scenarios. Tunneling junctions can potentially serve as low-voltage, planar-type electron sources with good emission current density. However, further lowering the extraction voltage while maintaining the emission current density remains challenging. In this paper, we report a low-voltage planar-type electron source based on graphene-hBN-graphene heterostructures (GBGH) under a very low out-of-plane extraction voltage. The external electric field strength applied to the electron sources is only $4\times10^{4}$ V/m, and an accelerating voltage as low as 20 V is realized. Steady electron emission of over 1 nA and an operating duration of several hours are observed from a GBGH with a size of 59.29 $\mu m^2$ in our experiments, and thus the maximum emission current density reaches 7 mA/cm$^2$. Good electrical contacts, extremely low thickness, and excellent layer properties of two-dimensional (2D) materials lead to easily fabricated and miniature on-chip electron sources, which would significantly contribute to the development of next-generation free electron devices.
- Published
- 2024
25. A Survey of Retrieval Algorithms in Ad and Content Recommendation Systems
- Author
Zhao, Yu and Liu, Fang
- Subjects
Computer Science - Information Retrieval, Computer Science - Artificial Intelligence
- Abstract
This survey examines the most effective retrieval algorithms utilized in ad recommendation and content recommendation systems. Ad targeting algorithms rely on detailed user profiles and behavioral data to deliver personalized advertisements, thereby driving revenue through targeted placements. Conversely, organic retrieval systems aim to improve user experience by recommending content that matches user preferences. This paper compares these two applications and explains the most effective methods employed in each.
- Published
- 2024
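At serving time, both ad and organic retrieval usually reduce to the same primitive: embed the user or query, then take the top-k catalog items by inner product. A brute-force sketch follows; the two-tower embeddings are simulated with random vectors, and a production system would replace the exact scan with an approximate nearest-neighbor index.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for embeddings from a trained two-tower model: one user
# vector and a catalog of item vectors (ads or organic content alike).
item_emb = rng.normal(size=(100_000, 64)).astype(np.float32)
user_emb = rng.normal(size=64).astype(np.float32)

def retrieve_topk(user, items, k=10):
    """Exact top-k by inner product; ANN indexes (e.g., HNSW) replace
    this argpartition scan when the catalog gets large."""
    scores = items @ user
    top = np.argpartition(scores, -k)[-k:]
    return top[np.argsort(scores[top])[::-1]]   # sorted best-first

print(retrieve_topk(user_emb, item_emb))
```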
26. Technique Report of CVPR 2024 PBDL Challenges
- Author
Fu, Ying, Li, Yu, You, Shaodi, Shi, Boxin, Chen, Linwei, Zou, Yunhao, Wang, Zichun, Li, Yichen, Han, Yuze, Zhang, Yingkai, Wang, Jianan, Liu, Qinglin, Yu, Wei, Lv, Xiaoqian, Li, Jianing, Zhang, Shengping, Ji, Xiangyang, Chen, Yuanpei, Zhang, Yuhan, Peng, Weihang, Zhang, Liwen, Xu, Zhe, Gou, Dingyong, Li, Cong, Xu, Senyan, Zhang, Yunkang, Jiang, Siyuan, Lu, Xiaoqiang, Jiao, Licheng, Liu, Fang, Liu, Xu, Li, Lingling, Ma, Wenping, Yang, Shuyuan, Xie, Haiyang, Zhao, Jian, Huang, Shihua, Cheng, Peng, Shen, Xi, Wang, Zheng, An, Shuai, Zhu, Caizhi, Li, Xuelong, Zhang, Tao, Li, Liang, Liu, Yu, Yan, Chenggang, Zhang, Gengchen, Jiang, Linyan, Song, Bingyi, An, Zhuoyu, Lei, Haibo, Luo, Qing, Song, Jie, Liu, Yuan, Li, Qihang, Zhang, Haoyuan, Wang, Lingfeng, Chen, Wei, Luo, Aling, Li, Cheng, Cao, Jun, Chen, Shu, Dou, Zifei, Liu, Xinyu, Zhang, Jing, Zhang, Kexin, Yang, Yuting, Gou, Xuejian, Wang, Qinliang, Liu, Yang, Zhao, Shizhan, Zhang, Yanzhao, Yan, Libo, Guo, Yuwei, Li, Guoxin, Gao, Qiong, Che, Chenyue, Sun, Long, Chen, Xiang, Li, Hao, Pan, Jinshan, Xie, Chuanlong, Chen, Hongming, Li, Mingrui, Deng, Tianchen, Huang, Jingwei, Li, Yufeng, Wan, Fei, Xu, Bingxin, Cheng, Jian, Liu, Hongzhe, Xu, Cheng, Zou, Yuxiang, Pan, Weiguo, Dai, Songyin, Jia, Sen, Zhang, Junpei, and Chen, Puhua
- Subjects
Computer Science - Computer Vision and Pattern Recognition
- Abstract
The intersection of physics-based vision and deep learning presents an exciting frontier for advancing computer vision technologies. By leveraging the principles of physics to inform and enhance deep learning models, we can develop more robust and accurate vision systems. Physics-based vision aims to invert the processes to recover scene properties such as shape, reflectance, light distribution, and medium properties from images. In recent years, deep learning has shown promising improvements for various vision tasks, and when combined with physics-based vision, these approaches can enhance the robustness and accuracy of vision systems. This technical report summarizes the outcomes of the Physics-Based Vision Meets Deep Learning (PBDL) 2024 challenge, held in the CVPR 2024 workshop. The challenge consisted of eight tracks, focusing on Low-Light Enhancement and Detection as well as High Dynamic Range (HDR) Imaging. This report details the objectives, methodologies, and results of each track, highlighting the top-performing solutions and their innovative approaches.
- Comment: CVPR 2024 PBDL Challenges: https://pbdl-ws.github.io/pbdl2024/challenge/index.html
- Published
- 2024
27. Multiplane Prior Guided Few-Shot Aerial Scene Rendering
- Author
Gao, Zihan, Jiao, Licheng, Li, Lingling, Liu, Xu, Liu, Fang, Chen, Puhua, and Guo, Yuwei
- Subjects
Computer Science - Computer Vision and Pattern Recognition
- Abstract
Neural Radiance Fields (NeRF) have been successfully applied in various aerial scenes, yet they face challenges with sparse views due to limited supervision. The acquisition of dense aerial views is often prohibitive, as unmanned aerial vehicles (UAVs) may encounter constraints in perspective range and energy. In this work, we introduce Multiplane Prior guided NeRF (MPNeRF), a novel approach tailored for few-shot aerial scene rendering, marking a pioneering effort in this domain. Our key insight is that the intrinsic geometric regularities specific to aerial imagery could be leveraged to enhance NeRF in sparse aerial scenes. By investigating NeRF's and Multiplane Image (MPI)'s behavior, we propose to guide the training process of NeRF with a Multiplane Prior. The proposed Multiplane Prior draws upon MPI's benefits and incorporates advanced image comprehension through a SwinV2 Transformer, pre-trained via SimMIM. Our extensive experiments demonstrate that MPNeRF outperforms existing state-of-the-art methods applied in non-aerial contexts, by tripling the performance in SSIM and LPIPS even with three views available. We hope our work offers insights into the development of NeRF-based applications in aerial scenes with limited data.
- Comment: 17 pages, 8 figures, accepted at CVPR 2024
- Published
- 2024
28. Explainable Machine Learning Identification of Superconductivity from Single-Particle Spectral Functions
- Author
Chen, Xu, Sun, Yuanjie, Hruska, Eugen, Dixit, Vivek, Yang, Jinming, He, Yu, Wang, Yao, and Liu, Fang
- Subjects
Condensed Matter - Superconductivity, Condensed Matter - Strongly Correlated Electrons
- Abstract
The traditional method of identifying symmetry-breaking phase transitions through the emergence of a single-particle gap encounters significant challenges in quantum materials with strong fluctuations. To address this, we have developed a data-driven approach using a domain-adversarial neural network trained on simulated spectra of cuprates. This model compensates for the scarcity of experimental data -- a significant barrier to the wide deployment of machine learning in physical research -- by leveraging the abundance of theoretically simulated data. When applied to unlabeled experimental spectra, our model successfully distinguishes the true superconducting states from gapped fluctuating states, without the need for fine temperature sampling across the transition. Further, the explanation of our machine learning model reveals the crucial role of the Fermi-surface spectral intensity even in gapped states. It paves the way for robust and direct spectroscopic identification of fluctuating orders, particularly in low-dimensional, strongly correlated materials.
- Comment: 8 pages, 5 figures
- Published
- 2024
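The domain-adversarial network mentioned here hinges on a single trick: a gradient reversal layer between the feature extractor and a domain classifier, so the learned features stay predictive of the physical label while becoming uninformative about whether a spectrum is simulated or experimental. A minimal PyTorch sketch of that layer follows; the full spectra pipeline is beyond this snippet.

```python
import torch

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; multiplies the gradient by -lambda
    on the way back, so the feature extractor learns to fool the
    domain classifier while still serving the label classifier."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None

def grad_reverse(x, lam=1.0):
    return GradReverse.apply(x, lam)

# Toy check: the gradient through the layer is sign-flipped and scaled.
x = torch.ones(3, requires_grad=True)
grad_reverse(x, lam=0.5).sum().backward()
print(x.grad)   # tensor([-0.5000, -0.5000, -0.5000])
```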
29. A novel measurement method for SiPM external crosstalk probability at low temperature
- Author
Li, Guanda, Wang, Lei, Sun, Xilei, Liu, Fang, Guo, Cong, Zhao, Kangkang, Tian, Lei, Yu, Zeyuan, Hou, Zhilong, Li, Chi, Lei, Yu, Wang, Bin, and Zhou, Rongbin
- Subjects
Physics - Instrumentation and Detectors ,Nuclear Experiment - Abstract
Silicon photomultipliers (SiPMs) are being considered as potential replacements for conventional photomultiplier tubes (PMTs). However, a significant disadvantage of SiPMs is crosstalk (CT), wherein photons propagate through other pixels, resulting in secondary avalanches. CT can be categorized into internal crosstalk and external crosstalk based on whether the secondary avalanche occurs within the same SiPM or a different one. Numerous methods exist for quantitatively estimating the percentage of internal crosstalk (iCT). However, external crosstalk (eCT) has not been extensively studied. This article presents a novel measurement method for the probability of emitting an external crosstalk photon during a single pixel avalanche, using a setup involving two identical SiPMs facing each other, and without the need for complex optical designs. The entire apparatus is enclosed within a stainless steel chamber, functioning as a light-tight enclosure, and maintained at liquid nitrogen temperature. The experimental setup incorporates two Sensl J-60035 SiPM chips along with two 0.5-inch Hamamatsu Photonics (HPK) VUV4 S13370-6050CN SiPM arrays. The findings show a linear relationship between the probability of emitting an external crosstalk photon and the SiPM overvoltage for both SiPM samples. Surprisingly, this novel measurement method also provides measurements of the SiPM photon detection efficiency (PDE) for eCT photons at low temperature.
- Published
- 2024
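To illustrate the linear trend reported in the entry above, a toy fit of eCT-photon emission probability versus overvoltage; all numbers are invented, not the paper's data.

```python
# Toy linear fit of eCT probability vs. SiPM overvoltage (invented values).
import numpy as np

overvoltage = np.array([2.0, 3.0, 4.0, 5.0, 6.0])            # volts
p_ect = np.array([0.8e-3, 1.3e-3, 1.9e-3, 2.4e-3, 3.0e-3])   # photons per avalanche

slope, intercept = np.polyfit(overvoltage, p_ect, 1)
print(f"eCT probability ~ {slope:.2e}/V * OV + {intercept:.2e}")
```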
30. Automatic Graph Topology-Aware Transformer
- Author
-
Wang, Chao, Zhao, Jiaxuan, Li, Lingling, Jiao, Licheng, Liu, Fang, and Yang, Shuyuan
- Subjects
Computer Science - Neural and Evolutionary Computing ,Computer Science - Graphics ,Computer Science - Machine Learning - Abstract
Existing efforts are dedicated to designing many topologies and graph-aware strategies for the graph Transformer, which greatly improve the model's representation capabilities. However, manually determining the suitable Transformer architecture for a specific graph dataset or task requires extensive expert knowledge and laborious trials. This paper proposes an evolutionary graph Transformer architecture search framework (EGTAS) to automate the construction of strong graph Transformers. We build a comprehensive graph Transformer search space with micro-level and macro-level designs. EGTAS evolves graph Transformer topologies at the macro level and graph-aware strategies at the micro level. Furthermore, a surrogate model based on generic architectural coding is proposed to directly predict the performance of graph Transformers, substantially reducing the evaluation cost of evolutionary search. We demonstrate the efficacy of EGTAS across a range of graph-level and node-level tasks, encompassing both small-scale and large-scale graph datasets. Experimental results and ablation studies show that EGTAS can construct high-performance architectures that rival state-of-the-art manual and automated baselines., Comment: This work has been accepted by IEEE Transactions on Neural Networks and Learning Systems
- Published
- 2024
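A schematic of surrogate-assisted evolutionary search like the one described in the entry above: architectures are encoded as genomes, a cheap learned predictor ranks candidates, and evolution proceeds on predicted fitness so that only top candidates would need real training. The encoding, surrogate, and operators here are all hypothetical stand-ins.

```python
# Surrogate-assisted evolutionary architecture search, schematically.
import random

random.seed(0)
GENOME_LEN = 10  # e.g., macro-level topology choices + micro-level strategies

def random_genome():
    return [random.randint(0, 3) for _ in range(GENOME_LEN)]

def mutate(g):
    g = g[:]
    g[random.randrange(GENOME_LEN)] = random.randint(0, 3)
    return g

def surrogate_score(g):
    # Stand-in for a learned performance predictor over architecture encodings.
    return -sum((x - 2) ** 2 for x in g) + random.gauss(0, 0.1)

pop = [random_genome() for _ in range(20)]
for generation in range(30):
    pop.sort(key=surrogate_score, reverse=True)
    parents = pop[:5]                      # keep the surrogate's top picks
    pop = parents + [mutate(random.choice(parents)) for _ in range(15)]

print("best predicted genome:", max(pop, key=surrogate_score))
```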
31. JUNO Sensitivity to Invisible Decay Modes of Neutrons
- Author
-
JUNO Collaboration, Abusleme, Angel, Adam, Thomas, Adamowicz, Kai, Ahmad, Shakeel, Ahmed, Rizwan, Aiello, Sebastiano, An, Fengpeng, An, Qi, Andronico, Giuseppe, Anfimov, Nikolay, Antonelli, Vito, Antoshkina, Tatiana, de André, João Pedro Athayde Marcondes, Auguste, Didier, Bai, Weidong, Balashov, Nikita, Baldini, Wander, Barresi, Andrea, Basilico, Davide, Baussan, Eric, Bellato, Marco, Beretta, Marco, Bergnoli, Antonio, Bick, Daniel, Bieger, Lukas, Biktemerova, Svetlana, Birkenfeld, Thilo, Blake, Iwan, Blyth, Simon, Bolshakova, Anastasia, Bongrand, Mathieu, Breton, Dominique, Brigatti, Augusto, Brugnera, Riccardo, Bruno, Riccardo, Budano, Antonio, Busto, Jose, Cabrera, Anatael, Caccianiga, Barbara, Cai, Hao, Cai, Xiao, Cai, Yanke, Cai, Zhiyan, Callier, Stéphane, Calvez, Steven, Cammi, Antonio, Campeny, Agustin, Cao, Chuanya, Cao, Guofu, Cao, Jun, Caruso, Rossella, Cerna, Cédric, Cerrone, Vanessa, Chang, Jinfan, Chang, Yun, Chatrabhuti, Auttakit, Chen, Chao, Chen, Guoming, Chen, Pingping, Chen, Shaomin, Chen, Xin, Chen, Yiming, Chen, Yixue, Chen, Yu, Chen, Zelin, Chen, Zhangming, Chen, Zhiyuan, Chen, Zikang, Cheng, Jie, Cheng, Yaping, Cheng, Yu Chin, Chepurnov, Alexander, Chetverikov, Alexey, Chiesa, Davide, Chimenti, Pietro, Chin, Yen-Ting, Chou, Po-Lin, Chu, Ziliang, Chukanov, Artem, Claverie, Gérard, Clementi, Catia, Clerbaux, Barbara, Molla, Marta Colomer, Di Lorenzo, Selma Conforti, Coppi, Alberto, Corti, Daniele, Csakli, Simon, Cui, Chenyang, Corso, Flavio Dal, Dalager, Olivia, Datta, Jaydeep, De La Taille, Christophe, Deng, Zhi, Deng, Ziyan, Ding, Xiaoyu, Ding, Xuefeng, Ding, Yayun, Dirgantara, Bayu, Dittrich, Carsten, Dmitrievsky, Sergey, Dohnal, Tadeas, Dolzhikov, Dmitry, Donchenko, Georgy, Dong, Jianmeng, Doroshkevich, Evgeny, Dou, Wei, Dracos, Marcos, Druillole, Frédéric, Du, Ran, Du, Shuxian, Duan, Yujie, Dugas, Katherine, Dusini, Stefano, Duyang, Hongyue, Eck, Jessica, Enqvist, Timo, Fabbri, Andrea, Fahrendholz, Ulrike, Fan, Lei, Fang, Jian, Fang, Wenxing, Fedoseev, Dmitry, Feng, Li-Cheng, Feng, Qichun, Ferraro, Federico, Fournier, Amélie, Fritsch, Fritsch, Gan, Haonan, Gao, Feng, Garfagnini, Alberto, Gavrikov, Arsenii, Giammarchi, Marco, Giudice, Nunzio, Gonchar, Maxim, Gong, Guanghua, Gong, Hui, Gornushkin, Yuri, Grassi, Marco, Gromov, Maxim, Gromov, Vasily, Gu, Minghao, Gu, Xiaofei, Gu, Yu, Guan, Mengyun, Guan, Yuduo, Guardone, Nunzio, Guizzetti, Rosa Maria, Guo, Cong, Guo, Wanlei, Hagner, Caren, Han, Hechong, Han, Ran, Han, Yang, He, Jinhong, He, Miao, He, Wei, He, Xinhai, Heinz, Tobias, Hellmuth, Patrick, Heng, Yuekun, Herrera, Rafael, Hor, YuenKeung, Hou, Shaojing, Hsiung, Yee, Hu, Bei-Zhen, Hu, Hang, Hu, Jun, Hu, Peng, Hu, Shouyang, Hu, Tao, Hu, Yuxiang, Hu, Zhuojun, Huang, Guihong, Huang, Hanxiong, Huang, Jinhao, Huang, Junting, Huang, Kaixuan, Huang, Shengheng, Huang, Wenhao, Huang, Xin, Huang, Xingtao, Huang, Yongbo, Hui, Jiaqi, Huo, Lei, Huo, Wenju, Huss, Cédric, Hussain, Safeer, Imbert, Leonard, Ioannisian, Ara, Isocrate, Roberto, Jafar, Arshak, Jelmini, Beatrice, Jeria, Ignacio, Ji, Xiaolu, Jia, Huihui, Jia, Junji, Jian, Siyu, Jiang, Cailian, Jiang, Di, Jiang, Guangzheng, Jiang, Wei, Jiang, Xiaoshan, Jiang, Xiaozhao, Jiang, Yixuan, Jing, Xiaoping, Jollet, Cécile, Kang, Li, Karaparabil, Rebin, Kazarian, Narine, Khan, Ali, Khatun, Amina, Khosonthongkee, Khanchai, Korablev, Denis, Kouzakov, Konstantin, Krasnoperov, Alexey, Kuleshov, Sergey, Kumaran, Sindhujha, Kutovskiy, Nikolay, Labit, Loïc, Lachenmaier, Tobias, Lai, Haojing, Landini, Cecilia, Leblanc, Sébastien, 
Lefevre, Frederic, Lei, Ruiting, Leitner, Rupert, Leung, Jason, Li, Demin, Li, Fei, Li, Fule, Li, Gaosong, Li, Hongjian, Li, Huang, Li, Jiajun, Li, Min, Li, Nan, Li, Qingjiang, Li, Ruhui, Li, Rui, Li, Shanfeng, Li, Shuo, Li, Tao, Li, Teng, Li, Weidong, Li, Weiguo, Li, Xiaomei, Li, Xiaonan, Li, Xinglong, Li, Yi, Li, Yichen, Li, Yufeng, Li, Zhaohan, Li, Zhibing, Li, Ziyuan, Li, Zonghai, Liang, An-An, Liang, Hao, Liao, Jiajun, Liao, Yilin, Liao, Yuzhong, Limphirat, Ayut, Lin, Guey-Lin, Lin, Shengxin, Lin, Tao, Ling, Jiajie, Ling, Xin, Lippi, Ivano, Liu, Caimei, Liu, Fang, Liu, Fengcheng, Liu, Haidong, Liu, Haotian, Liu, Hongbang, Liu, Hongjuan, Liu, Hongtao, Liu, Hongyang, Liu, Jianglai, Liu, Jiaxi, Liu, Jinchang, Liu, Min, Liu, Qian, Liu, Qin, Liu, Runxuan, Liu, Shenghui, Liu, Shubin, Liu, Shulin, Liu, Xiaowei, Liu, Xiwen, Liu, Xuewei, Liu, Yankai, Liu, Zhen, Loi, Lorenzo, Lokhov, Alexey, Lombardi, Paolo, Lombardo, Claudio, Loo, Kai, Lu, Chuan, Lu, Haoqi, Lu, Jingbin, Lu, Junguang, Lu, Meishu, Lu, Peizhi, Lu, Shuxiang, Lu, Xianguo, Lubsandorzhiev, Bayarto, Lubsandorzhiev, Sultim, Ludhova, Livia, Lukanov, Arslan, Luo, Fengjiao, Luo, Guang, Luo, Jianyi, Luo, Shu, Luo, Wuming, Luo, Xiaojie, Lyashuk, Vladimir, Ma, Bangzheng, Ma, Bing, Ma, Qiumei, Ma, Si, Ma, Xiaoyan, Ma, Xubo, Maalmi, Jihane, Mai, Jingyu, Malabarba, Marco, Malyshkin, Yury, Mandujano, Roberto Carlos, Mantovani, Fabio, Mao, Xin, Mao, Yajun, Mari, Stefano M., Marini, Filippo, Martini, Agnese, Mayer, Matthias, Mayilyan, Davit, Mednieks, Ints, Meng, Yue, Meraviglia, Anita, Meregaglia, Anselmo, Meroni, Emanuela, Miramonti, Lino, Mohan, Nikhil, Montuschi, Michele, Reveco, Cristobal Morales, Nastasi, Massimiliano, Naumov, Dmitry V., Naumova, Elena, Navas-Nicolas, Diana, Nemchenok, Igor, Thi, Minh Thuan Nguyen, Nikolaev, Alexey, Ning, Feipeng, Ning, Zhe, Nunokawa, Hiroshi, Oberauer, Lothar, Ochoa-Ricoux, Juan Pedro, Olshevskiy, Alexander, Orestano, Domizia, Ortica, Fausto, Othegraven, Rainer, Paoloni, Alessandro, Parker, George, Parmeggiano, Sergio, Patsias, Achilleas, Pei, Yatian, Pelicci, Luca, Peng, Anguo, Peng, Haiping, Peng, Yu, Peng, Zhaoyuan, Percalli, Elisa, Perrin, Willy, Perrot, Frédéric, Petitjean, Pierre-Alexandre, Petrucci, Fabrizio, Pilarczyk, Oliver, Rico, Luis Felipe Piñeres, Popov, Artyom, Poussot, Pascal, Previtali, Ezio, Qi, Fazhi, Qi, Ming, Qi, Xiaohui, Qian, Sen, Qian, Xiaohui, Qian, Zhen, Qiao, Hao, Qin, Zhonghua, Qiu, Shoukang, Qu, Manhao, Qu, Zhenning, Ranucci, Gioacchino, Re, Alessandra, Rebii, Abdel, Redchuk, Mariia, Reina, Gioele, Ren, Bin, Ren, Jie, Ren, Yuhan, Ricci, Barbara, Rientong, Komkrit, Rifai, Mariam, Roche, Mathieu, Rodphai, Narongkiat, Romani, Aldo, Roskovec, Bedřich, Ruan, Xichao, Rybnikov, Arseniy, Sadovsky, Andrey, Saggese, Paolo, Sandanayake, Deshan, Sangka, Anut, Sava, Giuseppe, Sawangwit, Utane, Schever, Michaela, Schwab, Cédric, Schweizer, Konstantin, Selyunin, Alexandr, Serafini, Andrea, Settimo, Mariangela, Shao, Junyu, Sharov, Vladislav, Shi, Hexi, Shi, Jingyan, Shi, Yanan, Shutov, Vitaly, Sidorenkov, Andrey, Šimkovic, Fedor, Singhal, Apeksha, Sirignano, Chiara, Siripak, Jaruchit, Sisti, Monica, Smirnov, Mikhail, Smirnov, Oleg, Sokolov, Sergey, Songwadhana, Julanan, Soonthornthum, Boonrucksar, Sotnikov, Albert, Sreethawong, Warintorn, Stahl, Achim, Stanco, Luca, Stankevich, Konstantin, Steiger, Hans, Steinmann, Jochen, Sterr, Tobias, Stock, Matthias Raphael, Strati, Virginia, Strizh, Michail, Studenikin, Alexander, Su, Aoqi, Su, Jun, Sun, Guangbao, Sun, Shifeng, Sun, Xilei, Sun, Yongjie, 
Sun, Yongzhao, Sun, Zhengyang, Suwonjandee, Narumon, Takenaka, Akira, Tan, Xiaohan, Tang, Jian, Tang, Jingzhe, Tang, Qiang, Tang, Quan, Tang, Xiao, Hariharan, Vidhya Thara, Tkachev, Igor, Tmej, Tomas, Torri, Marco Danilo Claudio, Triossi, Andrea, Trzaska, Wladyslaw, Tung, Yu-Chen, Tuve, Cristina, Ushakov, Nikita, Vedin, Vadim, Venettacci, Carlo, Verde, Giuseppe, Vialkov, Maxim, Viaud, Benoit, Vollbrecht, Cornelius Moritz, von Sturm, Katharina, Vorobel, Vit, Voronin, Dmitriy, Votano, Lucia, Walker, Pablo, Wang, Caishen, Wang, Chung-Hsiang, Wang, En, Wang, Guoli, Wang, Hanwen, Wang, Jian, Wang, Jun, Wang, Li, Wang, Lu, Wang, Meng, Wang, Mingyuan, Wang, Qianchuan, Wang, Ruiguang, Wang, Sibo, Wang, Siguang, Wang, Wei, Wang, Wenshuai, Wang, Xi, Wang, Xiangyue, Wang, Yangfu, Wang, Yaoguang, Wang, Yi, Wang, Yifang, Wang, Yuanqing, Wang, Yuyi, Wang, Zhe, Wang, Zheng, Wang, Zhimin, Watcharangkool, Apimook, Wei, Wei, Wei, Wenlu, Wei, Yadong, Wei, Yuehuan, Wen, Liangjian, Weng, Jun, Wiebusch, Christopher, Wirth, Rosmarie, Wu, Chengxin, Wu, Diru, Wu, Qun, Wu, Yinhui, Wu, Yiyang, Wu, Zhi, Wurm, Michael, Wurtz, Jacques, Wysotzki, Christian, Xi, Yufei, Xia, Dongmei, Xian, Shishen, Xiang, Ziqian, Xiao, Fei, Xiao, Xiang, Xie, Xiaochuan, Xie, Yijun, Xie, Yuguang, Xin, Zhao, Xing, Zhizhong, Xu, Benda, Xu, Cheng, Xu, Donglian, Xu, Fanrong, Xu, Hangkun, Xu, Jiayang, Xu, Jilei, Xu, Jing, Xu, Jinghuan, Xu, Meihang, Xu, Xunjie, Xu, Yin, Xu, Yu, Yan, Baojun, Yan, Qiyu, Yan, Taylor, Yan, Xiongbo, Yan, Yupeng, Yang, Changgen, Yang, Chengfeng, Yang, Fengfan, Yang, Jie, Yang, Lei, Yang, Pengfei, Yang, Xiaoyu, Yang, Yifan, Yang, Yixiang, Yang, Zekun, Yao, Haifeng, Ye, Jiaxuan, Ye, Mei, Ye, Ziping, Yermia, Frédéric, You, Zhengyun, Yu, Boxiang, Yu, Chiye, Yu, Chunxu, Yu, Guojun, Yu, Hongzhao, Yu, Miao, Yu, Xianghui, Yu, Zeyuan, Yu, Zezhong, Yuan, Cenxi, Yuan, Chengzhuo, Yuan, Ying, Yuan, Zhenxiong, Yue, Baobiao, Zafar, Noman, Zamogilnyi, Kirill, Zavadskyi, Vitalii, Zeng, Fanrui, Zeng, Shan, Zeng, Tingxuan, Zeng, Yuda, Zhan, Liang, Zhang, Aiqiang, Zhang, Bin, Zhang, Binting, Zhang, Feiyang, Zhang, Hangchang, Zhang, Haosen, Zhang, Honghao, Zhang, Jialiang, Zhang, Jiawen, Zhang, Jie, Zhang, Jingbo, Zhang, Jinnan, Zhang, Junwei, Zhang, Lei, Zhang, Peng, Zhang, Ping, Zhang, Qingmin, Zhang, Shiqi, Zhang, Shu, Zhang, Shuihan, Zhang, Siyuan, Zhang, Tao, Zhang, Xiaomei, Zhang, Xin, Zhang, Xuantong, Zhang, Yibing, Zhang, Yinhong, Zhang, Yiyu, Zhang, Yongpeng, Zhang, Yu, Zhang, Yuanyuan, Zhang, Yumei, Zhang, Zhenyu, Zhang, Zhijian, Zhao, Jie, Zhao, Rong, Zhao, Runze, Zhao, Shujun, Zhao, Tianhao, Zheng, Hua, Zheng, Yangheng, Zhou, Jing, Zhou, Li, Zhou, Nan, Zhou, Shun, Zhou, Tong, Zhou, Xiang, Zhou, Xing, Zhu, Jingsen, Zhu, Kangfu, Zhu, Kejun, Zhu, Zhihang, Zhuang, Bo, Zhuang, Honglin, Zong, Liang, and Zou, Jiaheng
- Subjects
High Energy Physics - Experiment ,High Energy Physics - Phenomenology - Abstract
We explore the decay of bound neutrons into invisible particles (e.g., $n\rightarrow 3 \nu$ or $nn \rightarrow 2 \nu$) in the JUNO liquid scintillator detector. The invisible decay includes two decay modes: $n \rightarrow \text{inv}$ and $nn \rightarrow \text{inv}$. The invisible decays of $s$-shell neutrons in $^{12}{\rm C}$ will leave a highly excited residual nucleus. Subsequently, some de-excitation modes of the excited residual nuclei can produce a time- and space-correlated triple coincidence signal in the JUNO detector. Based on a full Monte Carlo simulation informed with the latest available data, we estimate all backgrounds, including inverse beta decay events of the reactor antineutrino $\bar{\nu}_e$, natural radioactivity, cosmogenic isotopes and neutral current interactions of atmospheric neutrinos. Pulse shape discrimination and multivariate analysis techniques are employed to further suppress backgrounds. With two years of exposure, JUNO is expected to give an order of magnitude improvement compared to the current best limits. After 10 years of data taking, the JUNO expected sensitivities at a 90% confidence level are $\tau/B(n \rightarrow \text{inv}) > 5.0 \times 10^{31} \, {\rm yr}$ and $\tau/B(nn \rightarrow \text{inv}) > 1.4 \times 10^{32} \, {\rm yr}$., Comment: 28 pages, 7 figures, 4 tables
- Published
- 2024
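For orientation, sensitivities of this kind typically follow the standard counting-experiment relation below, where $N_n$ is the number of exposed neutrons, $T$ the livetime, $\varepsilon$ the signal efficiency, and $S_{90}$ the 90% C.L. upper limit on signal counts. This is a schematic relation, not the collaboration's full analysis.

```latex
\frac{\tau}{B(n \to \mathrm{inv})} \;>\; \frac{N_n \,\varepsilon\, T}{S_{90}}
```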
32. Non-uniform dependence on initial data for the generalized Camassa-Holm equation in $C^1$
- Author
-
Yu, Yanghai and Liu, Fang
- Subjects
Mathematics - Analysis of PDEs - Abstract
It is shown in \cite[Adv. Differ. Equ(2017)]{HT} that the Cauchy problem for the generalized Camassa-Holm equation is well-posed in $C^1$ and the data-to-solution map is H\"{o}lder continuous from $C^\alpha$ to $\mathcal{C}([0,T];C^\alpha)$ with $\alpha\in[0,1)$. In this paper, we further show that the data-to-solution map of the generalized Camassa-Holm equation is not uniformly continuous on the initial data in $C^1$. In particular, our result can also complement previous work on the classical Camassa-Holm equation in \cite[Geom. Funct. Anal(2002)]{G02}.
- Published
- 2024
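For context, the classical Camassa-Holm equation reads as below; the generalized equation studied in the entry above modifies its nonlinear terms.

```latex
u_t - u_{txx} + 3 u u_x = 2 u_x u_{xx} + u u_{xxx}
```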
33. Precision measurement of the branching fraction of \boldmath $J/\psi\rightarrow K^+K^-$ via $\psi(2S)\rightarrow \pi^+\pi^-J/\psi$
- Author
-
BESIII Collaboration, Ablikim, M., Achasov, M. N., Adlarson, P., Ai, X. C., Aliberti, R., Amoroso, A., An, M. R., An, Q., Bai, Y., Bakina, O., Balossino, I., Ban, Y., Bao, H. -R., Batozskaya, V., Begzsuren, K., Berger, N., Berlowski, M., Bertani, M., Bettoni, D., Bianchi, F., Bianco, E., Bortone, A., Boyko, I., Briere, R. A., Brueggemann, A., Cai, H., Cai, X., Calcaterra, A., Cao, G. F., Cao, N., Cetin, S. A., Chang, J. F., Chang, W. L., Che, G. R., Chelkov, G., Chen, C., Chen, Chao, Chen, G., Chen, H. S., Chen, M. L., Chen, S. J., Chen, S. L., Chen, S. M., Chen, T., Chen, X. R., Chen, X. T., Chen, Y. B., Chen, Y. Q., Chen, Z. J., Choi, S. K., Chu, X., Cibinetto, G., Coen, S. C., Cossio, F., Cui, J. J., Dai, H. L., Dai, J. P., Dbeyssi, A., de Boer, R. E., Dedovich, D., Deng, Z. Y., Denig, A., Denysenko, I., Destefanis, M., De Mori, F., Ding, B., Ding, X. X., Ding, Y., Dong, J., Dong, L. Y., Dong, M. Y., Dong, X., Du, M. C., Du, S. X., Duan, Z. H., Egorov, P., Fan, Y. H., Fang, J., Fang, S. S., Fang, W. X., Fang, Y., Fang, Y. Q., Farinelli, R., Fava, L., Feldbauer, F., Felici, G., Feng, C. Q., Feng, J. H., Feng, Y. T., Fischer, K, Fritsch, M., Fu, C. D., Fu, J. L., Fu, Y. W., Gao, H., Gao, Y. N., Gao, Yang, Garbolino, S., Garzia, I., Ge, P. T., Ge, Z. W., Geng, C., Gersabeck, E. M., Gilman, A, Goetzen, K., Gong, L., Gong, W. X., Gradl, W., Gramigna, S., Greco, M., Gu, M. H., Gu, Y. T., Guan, C. Y, Guan, Z. L., Guo, A. Q., Guo, L. B., Guo, M. J., Guo, R. P., Guo, Y. P., Guskov, A., Gutierrez, J., Han, K. L., Han, T. T., Han, W. Y., Hao, X. Q., Harris, F. A., He, K. K., He, K. L., Heinsius, F. H. H., Heinz, C. H., Heng, Y. K., Herold, C., Holtmann, T., Hong, P. C., Hou, G. Y., Hou, X. T., Hou, Y. R., Hou, Z. L., Hu, B. Y., Hu, H. M., Hu, J. F., Hu, T., Hu, Y., Huang, G. S., Huang, K. X., Huang, L. Q., Huang, X. T., Huang, Y. P., Hussain, T., Hüsken, N, der Wiesche, N. in, Irshad, M., Jackson, J., Jaeger, S., Janchiv, S., Jeong, J. H., Ji, Q., Ji, Q. P., Ji, X. B., Ji, X. L., Ji, Y. Y., Jia, X. Q., Jia, Z. K., Jiang, H. B., Jiang, P. C., Jiang, S. S., Jiang, T. J., Jiang, X. S., Jiang, Y., Jiao, J. B., Jiao, Z., Jin, S., Jin, Y., Jing, M. Q., Jing, X. M., Johansson, T., K., X., Kabana, S., Kalantar-Nayestanaki, N., Kang, X. L., Kang, X. S., Kavatsyuk, M., Ke, B. C., Khachatryan, V., Khoukaz, A., Kiuchi, R., Kolcu, O. B., Kopf, B., Kuessner, M., Kupsc, A., Kühn, W., Lane, J. J., Larin, P., Lavezzi, L., Lei, T. T., Lei, Z. H., Leithoff, H., Lellmann, M., Lenz, T., Li, C., Li, C. H., Li, Cheng, Li, D. M., Li, F., Li, G., Li, H., Li, H. B., Li, H. J., Li, H. N., Li, Hui, Li, J. R., Li, J. S., Li, J. W., Li, Ke, Li, L. J, Li, L. K., Li, Lei, Li, M. H., Li, P. R., Li, Q. X., Li, S. X., Li, T., Li, W. D., Li, W. G., Li, X. H., Li, X. L., Li, Xiaoyu, Li, Y. G., Li, Z. J., Li, Z. X., Liang, C., Liang, H., Liang, Y. F., Liang, Y. T., Liao, G. R., Liao, L. Z., Liao, Y. P., Libby, J., Limphirat, A., Lin, D. X., Lin, T., Liu, B. J., Liu, B. X., Liu, C., Liu, C. X., Liu, F. H., Liu, Fang, Liu, Feng, Liu, G. M., Liu, H., Liu, H. B., Liu, H. M., Liu, Huanhuan, Liu, Huihui, Liu, J. B., Liu, J. Y., Liu, K., Liu, K. Y., Liu, Ke, Liu, L., Liu, L. C., Liu, Lu, Liu, M. H., Liu, P. L., Liu, Q., Liu, S. B., Liu, T., Liu, W. K., Liu, W. M., Liu, X., Liu, Y., Liu, Y. B., Liu, Z. A., Liu, Z. Q., Lou, X. C., Lu, F. X., Lu, H. J., Lu, J. G., Lu, X. L., Lu, Y., Lu, Y. P., Lu, Z. H., Luo, C. L., Luo, M. X., Luo, T., Luo, X. L., Lyu, X. R., Lyu, Y. F., Ma, F. C., Ma, H., Ma, H. L., Ma, J. L., Ma, L. L., Ma, M. M., Ma, Q. 
M., Ma, R. Q., Ma, X. Y., Ma, Y., Ma, Y. M., Maas, F. E., Maggiora, M., Malde, S., Mangoni, A., Mao, Y. J., Mao, Z. P., Marcello, S., Meng, Z. X., Messchendorp, J. G., Mezzadri, G., Miao, H., Min, T. J., Mitchell, R. E., Mo, X. H., Moses, B., Muchnoi, N. Yu., Muskalla, J., Nefedov, Y., Nerling, F., Nikolaev, I. B., Ning, Z., Nisar, S., Niu, Q. L., Niu, W. D., Niu, Y., Olsen, S. L., Ouyang, Q., Pacetti, S., Pan, X., Pan, Y., Pathak, A., Patteri, P., Pei, Y. P., Pelizaeus, M., Peng, H. P., Peng, Y. Y., Peters, K., Ping, J. L., Ping, R. G., Plura, S., Prasad, V., Qi, F. Z., Qi, H., Qi, H. R., Qi, M., Qi, T. Y., Qian, S., Qian, W. B., Qiao, C. F., Qin, J. J., Qin, L. Q., Qin, X. S., Qin, Z. H., Qiu, J. F., Qu, S. Q., Redmer, C. F., Ren, K. J., Rivetti, A., Rolo, M., Rong, G., Rosner, Ch., Ruan, S. N., Salone, N., Sarantsev, A., Schelhaas, Y., Schoenning, K., Scodeggio, M., Shan, K. Y., Shan, W., Shan, X. Y., Shangguan, J. F., Shao, L. G., Shao, M., Shen, C. P., Shen, H. F., Shen, W. H., Shen, X. Y., Shi, B. A., Shi, H. C., Shi, J. L., Shi, J. Y., Shi, Q. Q., Shi, R. S., Shi, X., Song, J. J., Song, T. Z., Song, W. M., Song, Y. J., Sosio, S., Spataro, S., Stieler, F., Su, Y. J., Sun, G. B., Sun, G. X., Sun, H., Sun, H. K., Sun, J. F., Sun, K., Sun, L., Sun, S. S., Sun, T., Sun, W. Y., Sun, Y., Sun, Y. J., Sun, Y. Z., Sun, Z. T., Tan, Y. X., Tang, C. J., Tang, G. Y., Tang, J., Tang, Y. A., Tao, L. Y, Tao, Q. T., Tat, M., Teng, J. X., Thoren, V., Tian, W. H., Tian, Y., Tian, Z. F., Uman, I., Wan, Y., Wang, S. J., Wang, B., Wang, B. L., Wang, Bo, Wang, C. W., Wang, D. Y., Wang, F., Wang, H. J., Wang, J. P., Wang, K., Wang, L. L., Wang, M., Wang, Meng, Wang, N. Y., Wang, S., Wang, T., Wang, T. J., Wang, W., Wang, W. P., Wang, X., Wang, X. F., Wang, X. J., Wang, X. L., Wang, Y., Wang, Y. D., Wang, Y. F., Wang, Y. L., Wang, Y. N., Wang, Y. Q., Wang, Yaqian, Wang, Yi, Wang, Z., Wang, Z. L., Wang, Z. Y., Wang, Ziyi, Wei, D., Wei, D. H., Weidner, F., Wen, S. P., Wenzel, C. W., Wiedner, U., Wilkinson, G., Wolke, M., Wollenberg, L., Wu, C., Wu, J. F., Wu, L. H., Wu, L. J., Wu, X., Wu, X. H., Wu, Y., Wu, Y. H., Wu, Y. J., Wu, Z., Xia, L., Xian, X. M., Xiang, T., Xiao, D., Xiao, G. Y., Xiao, S. Y., Xiao, Y. L., Xiao, Z. J., Xie, C., Xie, X. H., Xie, Y., Xie, Y. G., Xie, Y. H., Xie, Z. P., Xing, T. Y., Xu, C. F., Xu, C. J., Xu, G. F., Xu, H. Y., Xu, Q. J., Xu, Q. N., Xu, W., Xu, W. L., Xu, X. P., Xu, Y. C., Xu, Z. P., Xu, Z. S., Yan, F., Yan, L., Yan, W. B., Yan, W. C., Yan, X. Q., Yang, H. J., Yang, H. L., Yang, H. X., Yang, Tao, Yang, Y., Yang, Y. F., Yang, Y. X., Yang, Yifan, Yang, Z. W., Yao, Z. P., Ye, M., Ye, M. H., Yin, J. H., You, Z. Y., Yu, B. X., Yu, C. X., Yu, G., Yu, J. S., Yu, T., Yu, X. D., Yuan, C. Z., Yuan, L., Yuan, S. C., Yuan, Y., Yuan, Z. Y., Yue, C. X., Zafar, A. A., Zeng, F. R., Zeng, S. H., Zeng, X., Zeng, Y., Zeng, Y. J., Zhai, X. Y., Zhai, Y. C., Zhan, Y. H., Zhang, A. Q., Zhang, B. L., Zhang, B. X., Zhang, D. H., Zhang, G. Y., Zhang, H., Zhang, H. C., Zhang, H. H., Zhang, H. Q., Zhang, H. Y., Zhang, J., Zhang, J. J., Zhang, J. L., Zhang, J. Q., Zhang, J. W., Zhang, J. X., Zhang, J. Y., Zhang, J. Z., Zhang, Jianyu, Zhang, L. M., Zhang, L. Q., Zhang, Lei, Zhang, P., Zhang, Q. Y., Zhang, Shuihan, Zhang, Shulei, Zhang, X. D., Zhang, X. M., Zhang, X. Y., Zhang, Y., Zhang, Y. T., Zhang, Y. H., Zhang, Yan, Zhang, Yao, Zhang, Z. D., Zhang, Z. H., Zhang, Z. L., Zhang, Z. Y., Zhao, G., Zhao, J. Y., Zhao, J. Z., Zhao, Lei, Zhao, Ling, Zhao, M. G., Zhao, R. P., Zhao, S. J., Zhao, Y. B., Zhao, Y. 
X., Zhao, Z. G., Zhemchugov, A., Zheng, B., Zheng, J. P., Zheng, W. J., Zheng, Y. H., Zhong, B., Zhong, X., Zhou, H., Zhou, L. P., Zhou, X., Zhou, X. K., Zhou, X. R., Zhou, X. Y., Zhou, Y. Z., Zhu, J., Zhu, K., Zhu, K. J., Zhu, L., Zhu, L. X., Zhu, S. H., Zhu, S. Q., Zhu, T. J., Zhu, W. J., Zhu, Y. C., Zhu, Z. A., Zou, J. H., and Zu, J.
- Subjects
High Energy Physics - Experiment - Abstract
Using a sample of $448.1 \times 10^6$ $\psi(2S)$ events collected with the BESIII detector, we perform a study of the decay $J/\psi\rightarrow K^+K^-$ via $\psi(2S)\rightarrow \pi^+\pi^-J/\psi$. The branching fraction of $J/\psi\rightarrow K^+K^-$ is determined to be $\mathcal{B}_{K^+K^-}=(3.072\pm 0.023({\rm stat.})\pm 0.050({\rm syst.}))\times 10^{-4}$, which is consistent with previous measurements but with significantly improved precision., Comment: to be submitted to PRD
- Published
- 2024
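Schematically, the branching fraction in an analysis like the one above is extracted from the fitted signal yield via the generic counting relation below ($N_{\mathrm{sig}}$: signal yield, $\varepsilon$: detection efficiency); this is not the paper's full systematic treatment.

```latex
\mathcal{B}(J/\psi \to K^+ K^-) =
  \frac{N_{\mathrm{sig}}}
       {N_{\psi(2S)} \cdot \mathcal{B}(\psi(2S) \to \pi^+ \pi^- J/\psi) \cdot \varepsilon}
```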
34. Electrically switchable $2^N$-channel wave-front control with N cascaded polarization-dependent metasurfaces
- Author
-
Ma, Zhiyao, Tian, Tian, Liao, Yuxuan, Feng, Xue, Li, Yongzhuo, Cui, Kaiyu, Liu, Fang, Sun, Hao, Zhang, Wei, and Huang, Yidong
- Subjects
Physics - Optics ,Physics - Applied Physics - Abstract
Metasurfaces with tunable functionalities are greatly desired for modern optical systems and various applications. To increase the operating channels of polarization-multiplexed metasurfaces, we propose a structure of N cascaded dual-channel metasurfaces to achieve $2^N$ electrically switchable functional channels without intrinsic noise or cross-talk. As a proof of principle, we implemented a 3-layer setup to achieve 8 channels. We successfully demonstrated two typical functionalities: vortex beam generation with a switchable topological charge of $l=-3$ to $+4$ or $l=-1$ to $-8$, and beam steering with the deflection direction switchable within an 8×1 line or a 4×2 grid. We believe our proposal provides a practical way to significantly increase the scalability and extend the functionality of polarization-multiplexed metasurfaces, with potential applications in LiDAR, glasses-free 3D display, OAM (de)multiplexing, and varifocal meta-lenses.
- Published
- 2024
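The channel count in the entry above follows directly from the cascade: each electrically switched dual-channel layer contributes one bit, so N layers address $2^N$ channels. A tiny enumeration makes this concrete; the state-to-channel mapping is purely illustrative.

```python
# Enumerate the 2**N channels addressed by N binary (dual-channel) layers.
from itertools import product

N = 3  # layers, as in the paper's proof-of-principle setup
for idx, states in enumerate(product((0, 1), repeat=N)):
    print(f"channel {idx}: layer polarization states {states}")
# -> 2**3 = 8 channels
```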
35. Search for the leptonic decays $D^{*+}\to e^+\nu_e$ and $D^{*+}\to \mu^+\nu_\mu$
- Author
-
BESIII Collaboration, Ablikim, M., Achasov, M. N., Adlarson, P., Albrecht, M., Aliberti, R., Amoroso, A., An, M. R., An, Q., Bai, Y., Bakina, O., Ferroli, R. Baldini, Balossino, I., Ban, Y., Batozskaya, V., Becker, D., Begzsuren, K., Berger, N., Bertani, M., Bettoni, D., Bianchi, F., Bianco, E., Bloms, J., Bortone, A., Boyko, I., Briere, R. A., Brueggemann, A., Cai, H., Cai, X., Calcaterra, A., Cao, G. F., Cao, N., Cetin, S. A., Chang, J. F., Chang, W. L., Che, G. R., Chelkov, G., Chen, C., Chen, Chao, Chen, G., Chen, H. S., Chen, M. L., Chen, S. J., Chen, S. M., Chen, T., Chen, X. R., Chen, X. T., Chen, Y. B., Chen, Z. J., Cheng, W. S., Choi, S. K., Chu, X., Cibinetto, G., Cossio, F., Cui, J. J., Dai, H. L., Dai, J. P., Dbeyssi, A., de Boer, R. E., Dedovich, D., Deng, Z. Y., Denig, A., Denysenko, I., Destefanis, M., De Mori, F., Ding, Y., Dong, J., Dong, L. Y., Dong, M. Y., Dong, X., Du, S. X., Duan, Z. H., Egorov, P., Fan, Y. L., Fang, J., Fang, S. S., Fang, W. X., Fang, Y., Farinelli, R., Fava, L., Feldbauer, F., Felici, G., Feng, C. Q., Feng, J. H., Fischer, K, Fritsch, M., Fritzsch, C., Fu, C. D., Gao, H., Gao, Y. N., Gao, Yang, Garbolino, S., Garzia, I., Ge, P. T., Ge, Z. W., Geng, C., Gersabeck, E. M., Gilman, A, Goetzen, K., Gong, L., Gong, W. X., Gradl, W., Greco, M., Gu, L. M., Gu, M. H., Gu, Y. T., Guan, C. Y, Guo, A. Q., Guo, L. B., Guo, R. P., Guo, Y. P., Guskov, A., Han, W. Y., Hao, X. Q., Harris, F. A., He, K. K., He, K. L., Heinsius, F. H., Heinz, C. H., Heng, Y. K., Herold, C., Hou, G. Y., Hou, Y. R., Hou, Z. L., Hu, H. M., Hu, J. F., Hu, T., Hu, Y., Huang, G. S., Huang, K. X., Huang, L. Q., Huang, X. T., Huang, Y. P., Huang, Z., Huang, Z. C., Hussain, T., Hüsken, N, Imoehl, W., Irshad, M., Jackson, J., Jaeger, S., Janchiv, S., Jang, E., Jeong, J. H., Ji, Q., Ji, Q. P., Ji, X. B., Ji, X. L., Ji, Y. Y., Jia, Z. K., Jiang, P. C., Jiang, S. S., Jiang, X. S., Jiang, Y., Jiao, J. B., Jiao, Z., Jin, S., Jin, Y., Jing, M. Q., Johansson, T., Kabana, S., Kalantar-Nayestanaki, N., Kang, X. L., Kang, X. S., Kappert, R., Kavatsyuk, M., Ke, B. C., Keshk, I. K., Khoukaz, A., Kiuchi, R., Kliemt, R., Koch, L., Kolcu, O. B., Kopf, B., Kuemmel, M., Kuessner, M., Kupsc, A., Kühn, W., Lane, J. J., Lange, J. S., Larin, P., Lavania, A., Lavezzi, L., Lei, T. T., Lei, Z. H., Leithoff, H., Lellmann, M., Lenz, T., Li, C., Li, C. H., Li, Cheng, Li, D. M., Li, F., Li, G., Li, H., Li, H. B., Li, H. J., Li, H. N., Li, Hui, Li, J. Q., Li, J. S., Li, J. W., Li, Ke, Li, L. J, Li, L. K., Li, Lei, Li, M. H., Li, P. R., Li, S. X., Li, S. Y., Li, T., Li, W. D., Li, W. G., Li, X. H., Li, X. L., Li, Xiaoyu, Li, Y. G., Li, Z. X., Li, Z. Y., Liang, C., Liang, H., Liang, Y. F., Liang, Y. T., Liao, G. R., Liao, L. Z., Libby, J., Limphirat, A., Lin, C. X., Lin, D. X., Lin, T., Liu, B. J., Liu, C., Liu, C. X., Liu, D., Liu, F. H., Liu, Fang, Liu, Feng, Liu, G. M., Liu, H., Liu, H. B., Liu, H. M., Liu, Huanhuan, Liu, Huihui, Liu, J. B., Liu, J. L., Liu, J. Y., Liu, K., Liu, K. Y., Liu, Ke, Liu, L., Liu, Lu, Liu, M. H., Liu, P. L., Liu, Q., Liu, S. B., Liu, T., Liu, W. K., Liu, W. M., Liu, X., Liu, Y., Liu, Y. B., Liu, Z. A., Liu, Z. Q., Lou, X. C., Lu, F. X., Lu, H. J., Lu, J. G., Lu, X. L., Lu, Y., Lu, Y. P., Lu, Z. H., Luo, C. L., Luo, M. X., Luo, T., Luo, X. L., Lyu, X. R., Lyu, Y. F., Ma, F. C., Ma, H. L., Ma, L. L., Ma, M. M., Ma, Q. M., Ma, R. Q., Ma, R. T., Ma, X. Y., Ma, Y., Maas, F. E., Maggiora, M., Maldaner, S., Malde, S., Malik, Q. A., Mangoni, A., Mao, Y. J., Mao, Z. P., Marcello, S., Meng, Z. 
X., Messchendorp, J. G., Mezzadri, G., Miao, H., Min, T. J., Mitchell, R. E., Mo, X. H., Muchnoi, N. Yu., Nefedov, Y., Nerling, F., Nikolaev, I. B., Ning, Z., Nisar, S., Niu, Y., Olsen, S. L., Ouyang, Q., Pacetti, S., Pan, X., Pan, Y., Pathak, A., Pei, Y. P., Pelizaeus, M., Peng, H. P., Peters, K., Ping, J. L., Ping, R. G., Plura, S., Pogodin, S., Prasad, V., Qi, F. Z., Qi, H., Qi, H. R., Qi, M., Qi, T. Y., Qian, S., Qian, W. B., Qian, Z., Qiao, C. F., Qin, J. J., Qin, L. Q., Qin, X. P., Qin, X. S., Qin, Z. H., Qiu, J. F., Qu, S. Q., Rashid, K. H., Redmer, C. F., Ren, K. J., Rivetti, A., Rodin, V., Rolo, M., Rong, G., Rosner, Ch., Ruan, S. N., Sarantsev, A., Schelhaas, Y., Schnier, C., Schoenning, K., Scodeggio, M., Shan, K. Y., Shan, W., Shan, X. Y., Shangguan, J. F., Shao, L. G., Shao, M., Shen, C. P., Shen, H. F., Shen, W. H., Shen, X. Y., Shi, B. A., Shi, H. C., Shi, J. Y., Shi, q. q., Shi, R. S., Shi, X., Song, J. J., Song, W. M., Song, Y. X., Sosio, S., Spataro, S., Stieler, F., Su, P. P., Su, Y. J., Sun, G. X., Sun, H., Sun, H. K., Sun, J. F., Sun, L., Sun, S. S., Sun, T., Sun, W. Y., Sun, Y. J., Sun, Y. Z., Sun, Z. T., Tan, Y. X., Tang, C. J., Tang, G. Y., Tang, J., Tao, L. Y, Tao, Q. T., Tat, M., Teng, J. X., Thoren, V., Tian, W. H., Tian, Y., Uman, I., Wang, B., Wang, B. L., Wang, C. W., Wang, D. Y., Wang, F., Wang, H. J., Wang, H. P., Wang, K., Wang, L. L., Wang, M., Wang, Meng, Wang, S., Wang, T., Wang, T. J., Wang, W., Wang, W. H., Wang, W. P., Wang, X., Wang, X. F., Wang, X. L., Wang, Y., Wang, Y. D., Wang, Y. F., Wang, Y. H., Wang, Y. Q., Wang, Yaqian, Wang, Z., Wang, Z. Y., Wang, Ziyi, Wei, D. H., Weidner, F., Wen, S. P., White, D. J., Wiedner, U., Wilkinson, G., Wolke, M., Wollenberg, L., Wu, J. F., Wu, L. H., Wu, L. J., Wu, X., Wu, X. H., Wu, Y., Wu, Y. J, Wu, Z., Xia, L., Xiang, T., Xiao, D., Xiao, G. Y., Xiao, H., Xiao, S. Y., Xiao, Y. L., Xiao, Z. J., Xie, C., Xie, X. H., Xie, Y., Xie, Y. G., Xie, Y. H., Xie, Z. P., Xing, T. Y., Xu, C. F., Xu, C. J., Xu, G. F., Xu, H. Y., Xu, Q. J., Xu, X. P., Xu, Y. C., Xu, Z. P., Yan, F., Yan, L., Yan, W. B., Yan, W. C., Yang, H. J., Yang, H. L., Yang, H. X., Yang, Tao, Yang, Y. F., Yang, Y. X., Yang, Yifan, Ye, M., Ye, M. H., Yin, J. H., You, Z. Y., Yu, B. X., Yu, C. X., Yu, G., Yu, T., Yu, X. D., Yuan, C. Z., Yuan, L., Yuan, S. C., Yuan, X. Q., Yuan, Y., Yuan, Z. Y., Yue, C. X., Zafar, A. A., Zeng, F. R., Zeng, X., Zeng, Y., Zhai, X. Y., Zhan, Y. H., Zhang, A. Q., Zhang, B. L., Zhang, B. X., Zhang, D. H., Zhang, G. Y., Zhang, H., Zhang, H. H., Zhang, H. Q., Zhang, H. Y., Zhang, J. J., Zhang, J. L., Zhang, J. Q., Zhang, J. W., Zhang, J. X., Zhang, J. Y., Zhang, J. Z., Zhang, Jianyu, Zhang, Jiawei, Zhang, L. M., Zhang, L. Q., Zhang, Lei, Zhang, P., Zhang, Q. Y., Zhang, Shuihan, Zhang, Shulei, Zhang, X. D., Zhang, X. M., Zhang, X. Y., Zhang, Y., Zhang, Y. T., Zhang, Y. H., Zhang, Yan, Zhang, Yao, Zhang, Z. H., Zhang, Z. L., Zhang, Z. Y., Zhao, G., Zhao, J., Zhao, J. Y., Zhao, J. Z., Zhao, Lei, Zhao, Ling, Zhao, M. G., Zhao, S. J., Zhao, Y. B., Zhao, Y. X., Zhao, Z. G., Zhemchugov, A., Zheng, B., Zheng, J. P., Zheng, Y. H., Zhong, B., Zhong, C., Zhong, X., Zhou, H., Zhou, L. P., Zhou, X., Zhou, X. K., Zhou, X. R., Zhou, X. Y., Zhou, Y. Z., Zhu, J., Zhu, K., Zhu, K. J., Zhu, L. X., Zhu, S. H., Zhu, S. Q., Zhu, T. J., Zhu, W. J., Zhu, Y. C., Zhu, Z. A., Zou, J. H., and Zu, J.
- Subjects
High Energy Physics - Experiment - Abstract
We present the first search for the leptonic decays $D^{*+}\to e^+\nu_e$ and $D^{*+}\to \mu^+\nu_\mu$ by analyzing a data sample of electron-positron collisions recorded with the BESIII detector at center-of-mass energies between 4.178 and 4.226 GeV, corresponding to an integrated luminosity of 6.32 fb$^{-1}$. No significant signal is observed. The upper limits on the branching fractions for $D^{*+}\to e^+\nu_e$ and $D^{*+}\to \mu^+\nu_\mu$ are set to be $1.1 \times 10^{-5}$ and $4.3 \times 10^{-6}$ at 90% confidence level, respectively., Comment: 14 pages, 7 figures
- Published
- 2024
36. Search for the radiative transition $\chi_{c1}(3872)\to\gamma \psi_2(3823)$
- Author
-
BESIII Collaboration, Ablikim, M., Achasov, M. N., Adlarson, P., Afedulidis, O., Ai, X. C., Aliberti, R., Amoroso, A., An, M. R., An, Q., Bai, Y., Bakina, O., Balossino, I., Ban, Y., Bao, H. -R., Batozskaya, V., Begzsuren, K., Berger, N., Berlowski, M., Bertani, M., Bettoni, D., Bianchi, F., Bianco, E., Bortone, A., Boyko, I., Briere, R. A., Brueggemann, A., Cai, H., Cai, X., Calcaterra, A., Cao, G. F., Cao, N., Cetin, S. A., Chang, J. F., Che, G. R., Chelkov, G., Chen, C., Chen, C. H., Chen, Chao, Chen, G., Chen, H. S., Chen, H. Y., Chen, M. L., Chen, S. J., Chen, S. L., Chen, S. M., Chen, T., Chen, X. R., Chen, X. T., Chen, Y. B., Chen, Y. Q., Chen, Z. J., Chen, Z. Y., Choi, S. K., Cibinetto, G., Cossio, F., Cui, J. J., Dai, H. L., Dai, J. P., Dbeyssi, A., de Boer, R. E., Dedovich, D., Deng, C. Q., Deng, Z. Y., Denig, A., Denysenko, I., Destefanis, M., De Mori, F., Ding, B., Ding, X. X., Ding, Y., Dong, J., Dong, L. Y., Dong, M. Y., Dong, X., Du, M. C., Du, S. X., Duan, Z. H., Egorov, P., Fan, Y. H., Fang, J., Fang, S. S., Fang, W. X., Fang, Y., Fang, Y. Q., Farinelli, R., Fava, L., Feldbauer, F., Felici, G., Feng, C. Q., Feng, J. H., Feng, Y. T., Fritsch, M., Fu, C. D., Fu, J. L., Fu, Y. W., Gao, H., Gao, X. B., Gao, Y. N., Gao, Yang, Garbolino, S., Garzia, I., Ge, L., Ge, P. T., Ge, Z. W., Geng, C., Gersabeck, E. M., Gilman, A., Goetzen, K., Gong, L., Gong, W. X., Gradl, W., Gramigna, S., Greco, M., Gu, M. H., Gu, Y. T., Guan, C. Y., Guan, Z. L., Guo, A. Q., Guo, L. B., Guo, M. J., Guo, R. P., Guo, Y. P., Guskov, A., Gutierrez, J., Han, K. L., Han, T. T., Hao, X. Q., Harris, F. A., He, K. K., He, K. L., Heinsius, F. H., Heinz, C. H., Heng, Y. K., Herold, C., Holtmann, T., Hong, P. C., Hou, G. Y., Hou, X. T., Hou, Y. R., Hou, Z. L., Hu, B. Y., Hu, H. M., Hu, J. F., Hu, S. L., Hu, T., Hu, Y., Huang, G. S., Huang, K. X., Huang, L. Q., Huang, X. T., Huang, Y. P., Hussain, T., Hölzken, F., Hüsken, N, der Wiesche, N. in, Jackson, J., Janchiv, S., Jeong, J. H., Ji, Q., Ji, Q. P., Ji, W., Ji, X. B., Ji, X. L., Ji, Y. Y., Jia, X. Q., Jia, Z. K., Jiang, D., Jiang, H. B., Jiang, P. C., Jiang, S. S., Jiang, T. J., Jiang, X. S., Jiang, Y., Jiao, J. B., Jiao, J. K., Jiao, Z., Jin, S., Jin, Y., Jing, M. Q., Jing, X. M., Johansson, T., Kabana, S., Kalantar-Nayestanaki, N., Kang, X. L., Kang, X. S., Kavatsyuk, M., Ke, B. C., Khachatryan, V., Khoukaz, A., Kiuchi, R., Kolcu, O. B., Kopf, B., Kuessner, M., Kui, X., Kumar, N., Kupsc, A., Kühn, W., Lane, J. J., Larin, P., Lavezzi, L., Lei, T. T., Lei, Z. H., Lellmann, M., Lenz, T., Li, C., Li, C. H., Li, Cheng, Li, D. M., Li, F., Li, G., Li, H. B., Li, H. J., Li, H. N., Li, Hui, Li, J. R., Li, J. S., Li, Ke, Li, L. J, Li, L. K., Li, Lei, Li, M. H., Li, P. R., Li, Q. M., Li, Q. X., Li, R., Li, S. X., Li, T., Li, W. D., Li, W. G., Li, X., Li, X. H., Li, X. L., Li, X. Z., Li, Xiaoyu, Li, Y. G., Li, Z. J., Li, Z. X., Liang, C., Liang, H., Liang, Y. F., Liang, Y. T., Liao, G. R., Liao, L. Z., Libby, J., Limphirat, A., Lin, C. C., Lin, D. X., Lin, T., Liu, B. J., Liu, B. X., Liu, C., Liu, C. X., Liu, F. H., Liu, Fang, Liu, Feng, Liu, G. M., Liu, H., Liu, H. B., Liu, H. M., Liu, Huanhuan, Liu, Huihui, Liu, J. B., Liu, J. Y., Liu, K., Liu, K. Y., Liu, Ke, Liu, L., Liu, L. C., Liu, Lu, Liu, M. H., Liu, P. L., Liu, Q., Liu, S. B., Liu, T., Liu, W. K., Liu, W. M., Liu, X., Liu, Y., Liu, Y. B., Liu, Z. A., Liu, Z. D., Liu, Z. Q., Lou, X. C., Lu, F. X., Lu, H. J., Lu, J. G., Lu, X. L., Lu, Y., Lu, Y. P., Lu, Z. H., Luo, C. L., Luo, M. X., Luo, T., Luo, X. L., Lyu, X. 
R., Lyu, Y. F., Ma, F. C., Ma, H., Ma, H. L., Ma, J. L., Ma, L. L., Ma, M. M., Ma, Q. M., Ma, R. Q., Ma, T., Ma, X. T., Ma, X. Y., Ma, Y., Ma, Y. M., Maas, F. E., Maggiora, M., Malde, S., Mao, Y. J., Mao, Z. P., Marcello, S., Meng, Z. X., Messchendorp, J. G., Mezzadri, G., Miao, H., Min, T. J., Mitchell, R. E., Mo, X. H., Moses, B., Muchnoi, N. Yu., Muskalla, J., Nefedov, Y., Nerling, F., Nie, L. S., Nikolaev, I. B., Ning, Z., Nisar, S., Niu, Q. L., Niu, W. D., Niu, Y., Olsen, S. L., Ouyang, Q., Pacetti, S., Pan, X., Pan, Y., Pathak, A., Patteri, P., Pei, Y. P., Pelizaeus, M., Peng, H. P., Peng, Y. Y., Peters, K., Ping, J. L., Ping, R. G., Plura, S., Prasad, V., Qi, F. Z., Qi, H., Qi, H. R., Qi, M., Qi, T. Y., Qian, S., Qian, W. B., Qiao, C. F., Qiao, X. K., Qin, J. J., Qin, L. Q., Qin, L. Y., Qin, X. S., Qin, Z. H., Qiu, J. F., Qu, Z. H., Redmer, C. F., Ren, K. J., Rivetti, A., Rolo, M., Rong, G., Rosner, Ch., Ruan, S. N., Salone, N., Sarantsev, A., Schelhaas, Y., Schoenning, K., Scodeggio, M., Shan, K. Y., Shan, W., Shan, X. Y., Shang, Z. J, Shangguan, J. F., Shao, L. G., Shao, M., Shen, C. P., Shen, H. F., Shen, W. H., Shen, X. Y., Shi, B. A., Shi, H., Shi, H. C., Shi, J. L., Shi, J. Y., Shi, Q. Q., Shi, S. Y., Shi, X., Song, J. J., Song, T. Z., Song, W. M., Song, Y. J., Song, Y. X., Sosio, S., Spataro, S., Stieler, F., Su, Y. J., Sun, G. B., Sun, G. X., Sun, H., Sun, H. K., Sun, J. F., Sun, K., Sun, L., Sun, S. S., Sun, T., Sun, W. Y., Sun, Y., Sun, Y. J., Sun, Y. Z., Sun, Z. Q., Sun, Z. T., Tang, C. J., Tang, G. Y., Tang, J., Tang, M., Tang, Y. A., Tao, L. Y., Tao, Q. T., Tat, M., Teng, J. X., Thoren, V., Tian, W. H., Tian, Y., Tian, Z. F., Uman, I., Wan, Y., Wang, S. J., Wang, B., Wang, B. L., Wang, Bo, Wang, D. Y., Wang, F., Wang, H. J., Wang, J. J., Wang, J. P., Wang, K., Wang, L. L., Wang, M., Wang, Meng, Wang, N. Y., Wang, S., Wang, T., Wang, T. J., Wang, W., Wang, W. P., Wang, X., Wang, X. F., Wang, X. J., Wang, X. L., Wang, X. N., Wang, Y., Wang, Y. D., Wang, Y. F., Wang, Y. L., Wang, Y. N., Wang, Y. Q., Wang, Yaqian, Wang, Yi, Wang, Z., Wang, Z. L., Wang, Z. Y., Wang, Ziyi, Wei, D. H., Weidner, F., Wen, S. P., Wen, Y. R., Wiedner, U., Wilkinson, G., Wolke, M., Wollenberg, L., Wu, C., Wu, J. F., Wu, L. H., Wu, L. J., Wu, X., Wu, X. H., Wu, Y., Wu, Y. H., Wu, Y. J., Wu, Z., Xia, L., Xian, X. M., Xiang, B. H., Xiang, T., Xiao, D., Xiao, G. Y., Xiao, S. Y., Xiao, Y. L., Xiao, Z. J., Xie, C., Xie, X. H., Xie, Y., Xie, Y. G., Xie, Y. H., Xie, Z. P., Xing, T. Y., Xu, C. F., Xu, C. J., Xu, G. F., Xu, H. Y., Xu, M., Xu, Q. J., Xu, Q. N., Xu, W., Xu, W. L., Xu, X. P., Xu, Y. C., Xu, Z. P., Xu, Z. S., Yan, F., Yan, L., Yan, W. B., Yan, W. C., Yan, X. Q., Yang, H. J., Yang, H. L., Yang, H. X., Yang, Tao, Yang, Y., Yang, Y. F., Yang, Y. X., Yang, Yifan, Yang, Z. W., Yao, Z. P., Ye, M., Ye, M. H., Yin, J. H., You, Z. Y., Yu, B. X., Yu, C. X., Yu, G., Yu, J. S., Yu, T., Yu, X. D., Yu, Y. C., Yuan, C. Z., Yuan, J., Yuan, L., Yuan, S. C., Yuan, Y., Yuan, Y. J., Yuan, Z. Y., Yue, C. X., Zafar, A. A., Zeng, F. R., Zeng, S. H., Zeng, X., Zeng, Y., Zeng, Y. J., Zhai, X. Y., Zhai, Y. C., Zhan, Y. H., Zhang, A. Q., Zhang, B. L., Zhang, B. X., Zhang, D. H., Zhang, G. Y., Zhang, H., Zhang, H. C., Zhang, H. H., Zhang, H. Q., Zhang, H. R., Zhang, H. Y., Zhang, J., Zhang, J. J., Zhang, J. L., Zhang, J. Q., Zhang, J. S., Zhang, J. W., Zhang, J. X., Zhang, J. Y., Zhang, J. Z., Zhang, Jianyu, Zhang, L. M., Zhang, Lei, Zhang, P., Zhang, Q. Y., Zhang, R. Y, Zhang, Shuihan, Zhang, Shulei, Zhang, X. 
D., Zhang, X. M., Zhang, X. Y., Zhang, Y., Zhang, Y. T., Zhang, Y. H., Zhang, Y. M., Zhang, Yan, Zhang, Yao, Zhang, Z. D., Zhang, Z. H., Zhang, Z. L., Zhang, Z. Y., Zhang, Z. Z., Zhao, G., Zhao, J. Y., Zhao, J. Z., Zhao, Lei, Zhao, Ling, Zhao, M. G., Zhao, N., Zhao, R. P., Zhao, S. J., Zhao, Y. B., Zhao, Y. X., Zhao, Z. G., Zhemchugov, A., Zheng, B., Zheng, B. M., Zheng, J. P., Zheng, W. J., Zheng, Y. H., Zhong, B., Zhong, X., Zhou, H., Zhou, J. Y., Zhou, L. P., Zhou, S., Zhou, X., Zhou, X. K., Zhou, X. R., Zhou, X. Y., Zhou, Y. Z., Zhu, J., Zhu, K., Zhu, K. J., Zhu, K. S., Zhu, L., Zhu, L. X., Zhu, S. H., Zhu, S. Q., Zhu, T. J., Zhu, W. D., Zhu, Y. C., Zhu, Z. A., Zou, J. H., and Zu, J.
- Subjects
High Energy Physics - Experiment - Abstract
Using 9.0 fb$^{-1}$ of $e^+e^-$ collision data collected at center-of-mass energies from 4.178 to 4.278 GeV with the BESIII detector at the BEPCII collider, we perform the first search for the radiative transition $\chi_{c1}(3872)\to\gamma \psi_2(3823)$. No $\chi_{c1}(3872)\to\gamma \psi_2(3823)$ signal is observed. The upper limit on the ratio of branching fractions $\mathcal{B}(\chi_{c1}(3872)\to\gamma \psi_2(3823), \psi_2(3823)\to\gamma\chi_{c1})/\mathcal{B}(\chi_{c1}(3872)\to\pi^+\pi^- J/\psi)$ is set as 0.075 at the 90% confidence level. Our result contradicts theoretical predictions under the assumption that the $\chi_{c1}(3872)$ is the pure charmonium state $\chi_{c1}(2P)$., Comment: 8 pages, 2 figures
- Published
- 2024
- Full Text
- View/download PDF
37. Multi-task learning for molecular electronic structure approaching coupled-cluster accuracy
- Author
-
Tang, Hao, Xiao, Brian, He, Wenhao, Subasic, Pero, Harutyunyan, Avetik R., Wang, Yao, Liu, Fang, Xu, Haowei, and Li, Ju
- Subjects
Physics - Chemical Physics ,Condensed Matter - Materials Science ,Computer Science - Artificial Intelligence ,Computer Science - Computational Engineering, Finance, and Science ,Physics - Computational Physics - Abstract
Machine learning (ML) plays an important role in quantum chemistry, providing fast-to-evaluate predictive models for various properties of molecules. However, most existing ML models for molecular electronic properties use density functional theory (DFT) databases as ground truth in training, and their prediction accuracy cannot surpass that of DFT. In this work, we developed a unified ML method for electronic structures of organic molecules using the gold-standard CCSD(T) calculations as training data. Tested on hydrocarbon molecules, our model outperforms DFT with the widely-used hybrid and double hybrid functionals in computational costs and prediction accuracy of various quantum chemical properties. As case studies, we apply the model to aromatic compounds and semiconducting polymers on both ground state and excited state properties, demonstrating its accuracy and generalization capability to complex systems that are hard to calculate using CCSD(T)-level methods.
- Published
- 2024
38. Edit-Your-Motion: Space-Time Diffusion Decoupling Learning for Video Motion Editing
- Author
-
Zuo, Yi, Li, Lingling, Jiao, Licheng, Liu, Fang, Liu, Xu, Ma, Wenping, Yang, Shuyuan, and Guo, Yuwei
- Subjects
Computer Science - Computer Vision and Pattern Recognition - Abstract
Existing diffusion-based methods have achieved impressive results in human motion editing. However, these methods often exhibit significant ghosting and body distortion in unseen in-the-wild cases. In this paper, we introduce Edit-Your-Motion, a video motion editing method that tackles these challenges through one-shot fine-tuning on unseen cases. Specifically, we first utilize DDIM inversion to initialize the noise, preserving the appearance of the source video, and design a lightweight motion attention adapter module to enhance motion fidelity. DDIM inversion aims to obtain the implicit representations by estimating the prediction noise from the source video, which serves as a starting point for the sampling process, ensuring appearance consistency between the source and edited videos. The Motion Attention Module (MA) enhances the model's motion editing ability by resolving the conflict between the skeleton features and the appearance features. Second, to effectively decouple the motion and appearance of the source video, we design a spatio-temporal two-stage learning strategy (STL). In the first stage, we focus on learning the temporal features of human motion and propose recurrent causal attention (RCA) to ensure consistency between video frames. In the second stage, we shift focus to learning the appearance features of the source video. With Edit-Your-Motion, users can edit the motion of humans in the source video, creating more engaging and diverse content. Extensive qualitative and quantitative experiments, along with user preference studies, show that Edit-Your-Motion outperforms other methods.
- Published
- 2024
39. LingML: Linguistic-Informed Machine Learning for Enhanced Fake News Detection
- Author
-
Singh, Jasraj, Liu, Fang, Xu, Hong, Ng, Bee Chin, and Zhang, Wei
- Subjects
Computer Science - Computation and Language - Abstract
Nowadays, information spreads at an unprecedented pace on social media, and discerning truth from misinformation and fake news has become an acute societal challenge. Machine learning (ML) models have been employed to identify fake news but are far from perfect, with challenging problems like limited accuracy, interpretability, and generalizability. In this paper, we enhance ML-based solutions with linguistic input and propose LingML, linguistic-informed ML, for fake news detection. We conducted an experimental study with a popular dataset on fake news during the pandemic. The experimental results show that our proposed solution is highly effective: with only linguistic input used in ML, there are fewer than two errors out of every ten attempts, and the knowledge is highly explainable. When linguistic input is integrated with advanced large-scale ML models for natural language processing, our solution outperforms existing ones with a 1.8% average error rate. LingML creates a new path with linguistics to push the frontier of effective and efficient fake news detection. It also sheds light on real-world multi-disciplinary applications requiring both ML and domain expertise to achieve optimal performance., Comment: 7 pages
- Published
- 2024
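A minimal sketch of the general linguistic-informed idea from the entry above: hand-crafted linguistic cues concatenated with standard text features before a simple classifier. The two cue functions and the two-example dataset are invented for demonstration; this is not the paper's feature set.

```python
# Toy linguistic-informed classifier: linguistic cues + TF-IDF features.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

texts = ["BREAKING!!! miracle cure found", "Health agency issues routine update"]
labels = [1, 0]  # 1 = fake, 0 = real (invented)

def linguistic_features(t):
    # Two invented cues: exclamation count and all-caps "shouting" words.
    return [t.count("!"), sum(w.isupper() for w in t.split())]

X_text = TfidfVectorizer().fit_transform(texts).toarray()
X_ling = np.array([linguistic_features(t) for t in texts])
X = np.hstack([X_text, X_ling])

clf = LogisticRegression().fit(X, labels)
print(clf.predict(X))
```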
40. Exploring Beyond Logits: Hierarchical Dynamic Labeling Based on Embeddings for Semi-Supervised Classification
- Author
-
Ma, Yanbiao, Jiao, Licheng, Liu, Fang, Li, Lingling, Yang, Shuyuan, and Liu, Xu
- Subjects
Computer Science - Computer Vision and Pattern Recognition ,Computer Science - Artificial Intelligence - Abstract
In semi-supervised learning, methods that rely on confidence learning to generate pseudo-labels have been widely proposed. However, a growing body of research finds that when faced with noisy and biased data, the model's representation network is more reliable than the classification network. Additionally, label generation methods based on model predictions often show poor adaptability across different datasets, necessitating customization of the classification network. Therefore, we propose a Hierarchical Dynamic Labeling (HDL) algorithm that does not depend on model predictions and utilizes image embeddings to generate sample labels. We also introduce an adaptive method for selecting hyperparameters in HDL, enhancing its versatility. Moreover, HDL can be combined with general image encoders (e.g., CLIP) to serve as a fundamental data processing module. We extract embeddings from datasets with class-balanced and long-tailed distributions using pre-trained semi-supervised models. Subsequently, samples are re-labeled using HDL, and the re-labeled samples are used to further train the semi-supervised models. Experiments demonstrate improved model performance, validating the motivation that representation networks are more reliable than classifiers or predictors. Our approach has the potential to change the paradigm of pseudo-label generation in semi-supervised learning.
- Published
- 2024
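A minimal sketch of label generation from embeddings rather than classifier logits, here via nearest labeled neighbors; the actual HDL algorithm in the entry above is hierarchical and adaptive, so this only illustrates the embedding-based principle with random stand-in data.

```python
# Pseudo-labeling from embeddings via k-nearest labeled neighbors.
import numpy as np

rng = np.random.default_rng(0)
emb_labeled = rng.normal(size=(100, 16))       # stand-in labeled embeddings
y_labeled = rng.integers(0, 5, size=100)       # 5 classes
emb_unlabeled = rng.normal(size=(10, 16))      # stand-in unlabeled embeddings

def knn_pseudo_label(e, k=7):
    d = np.linalg.norm(emb_labeled - e, axis=1)
    votes = y_labeled[np.argsort(d)[:k]]       # labels of k closest points
    return np.bincount(votes).argmax()         # majority vote

print([knn_pseudo_label(e) for e in emb_unlabeled])
```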
41. FedStyle: Style-Based Federated Learning Crowdsourcing Framework for Art Commissions
- Author
-
Ran, Changjuan, Guo, Yeting, Liu, Fang, Cui, Shenglan, and Ye, Yunfan
- Subjects
Computer Science - Machine Learning ,Computer Science - Computer Vision and Pattern Recognition - Abstract
The unique artistic style is crucial to artists' occupational competitiveness, yet prevailing Art Commission Platforms rarely support style-based retrieval. Meanwhile, the fast-growing generative AI techniques aggravate artists' concerns about releasing personal artworks to public platforms. To achieve artistic style-based retrieval without exposing personal artworks, we propose FedStyle, a style-based federated learning crowdsourcing framework. It allows artists to train local style models and share model parameters rather than artworks for collaboration. However, most artists possess a unique artistic style, resulting in severe model drift among them. FedStyle addresses such extreme data heterogeneity by having artists learn their abstract style representations and align with the server, rather than merely aggregating model parameters lacking semantics. In addition, we introduce contrastive learning to meticulously construct the style representation space, pulling artworks with similar styles closer and keeping different ones apart in the embedding space. Extensive experiments on the proposed datasets demonstrate the superiority of FedStyle., Comment: Accepted to ICME 2024
- Published
- 2024
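A toy version of the contrastive objective described in the entry above, pulling same-style embeddings together and pushing different styles apart by a margin; embeddings and style labels are random stand-ins, not FedStyle's actual loss.

```python
# Pairwise margin-based contrastive loss over style embeddings.
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 32))                       # style embeddings
z /= np.linalg.norm(z, axis=1, keepdims=True)      # unit-normalize
style = rng.integers(0, 3, size=8)                 # style ids

def contrastive_loss(z, style, margin=0.5):
    total, n = 0.0, 0
    for i in range(len(z)):
        for j in range(i + 1, len(z)):
            d = np.linalg.norm(z[i] - z[j])
            # pull same-style pairs together, push different styles past the margin
            total += d**2 if style[i] == style[j] else max(0.0, margin - d)**2
            n += 1
    return total / n

print("contrastive style loss:", contrastive_loss(z, style))
```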
42. Challenges of Using Pre-trained Models: the Practitioners' Perspective
- Author
-
Tan, Xin, Li, Taichuan, Chen, Ruohe, Liu, Fang, and Zhang, Li
- Subjects
Computer Science - Software Engineering - Abstract
The challenges associated with using pre-trained models (PTMs) have not been specifically investigated, which hampers their effective utilization. To address this knowledge gap, we collected and analyzed a dataset of 5,896 PTM-related questions on Stack Overflow. We first analyze the popularity and difficulty trends of PTM-related questions. We find that PTM-related questions are becoming more and more popular over time. However, it is noteworthy that PTM-related questions not only have a lower response rate but also exhibit a longer response time compared to many well-researched topics in software engineering. This observation emphasizes the significant difficulty and complexity associated with the practical application of PTMs. To delve into the specific challenges, we manually annotate 430 PTM-related questions, categorizing them into a hierarchical taxonomy of 42 codes (i.e., leaf nodes) and three categories. This taxonomy encompasses many prominent PTM challenges such as fine-tuning, output understanding, and prompt customization, which reflect the gaps between current techniques and practical needs. We discuss the implications of our study for PTM practitioners, vendors, and educators, and suggest possible directions and solutions for future research., Comment: SANER 2024
- Published
- 2024
43. Unveiling and Mitigating Generalized Biases of DNNs through the Intrinsic Dimensions of Perceptual Manifolds
- Author
-
Ma, Yanbiao, Jiao, Licheng, Liu, Fang, Li, Lingling, Ma, Wenping, Yang, Shuyuan, Liu, Xu, and Chen, Puhua
- Subjects
Computer Science - Computer Vision and Pattern Recognition ,Computer Science - Artificial Intelligence - Abstract
Building fair deep neural networks (DNNs) is a crucial step towards achieving trustworthy artificial intelligence. Delving into deeper factors that affect the fairness of DNNs is paramount and serves as the foundation for mitigating model biases. However, current methods are limited in accurately predicting DNN biases, relying solely on the number of training samples and lacking more precise measurement tools. Here, we establish a geometric perspective for analyzing the fairness of DNNs, comprehensively exploring how DNNs internally shape the intrinsic geometric characteristics of datasets, namely the intrinsic dimensions (IDs) of perceptual manifolds, and the impact of IDs on the fairness of DNNs. Based on multiple findings, we propose Intrinsic Dimension Regularization (IDR), which enhances the fairness and performance of models by promoting the learning of concise and ID-balanced class perceptual manifolds. In various image recognition benchmark tests, IDR significantly mitigates model bias while improving its performance., Comment: 8 pages, 6 figures, submitted to TPAMI
- Published
- 2024
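One common way to measure the intrinsic dimension of a perceptual manifold like those in the entry above is the TwoNN estimator of Facco et al. (2017), sketched below on synthetic data with a known roughly 3-dimensional structure; a regularizer in the spirit of IDR could then act on per-class estimates. This sketch is not the paper's implementation.

```python
# TwoNN intrinsic-dimension estimate from first/second neighbor distance ratios.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 64))  # ~3-dim manifold in R^64

def two_nn_id(X):
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    r = np.sort(d, axis=1)[:, :2]        # first and second neighbor distances
    mu = r[:, 1] / r[:, 0]               # distance ratios, Pareto-distributed
    return len(X) / np.sum(np.log(mu))   # maximum-likelihood estimate of the ID

print("estimated intrinsic dimension:", round(two_nn_id(X), 2))
```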
44. Exploring and Unleashing the Power of Large Language Models in Automated Code Translation
- Author
-
Yang, Zhen, Liu, Fang, Yu, Zhongxing, Keung, Jacky Wai, Li, Jia, Liu, Shuo, Hong, Yifan, Ma, Xiaoxue, Jin, Zhi, and Li, Ge
- Subjects
Computer Science - Software Engineering ,Computer Science - Artificial Intelligence - Abstract
Code translation tools (transpilers) are developed for automatic source-to-source translation. Although learning-based transpilers have shown impressive enhancement over rule-based counterparts, owing to their task-specific pre-training on extensive monolingual corpora, their current performance still remains unsatisfactory for practical deployment, and the associated training resources are also prohibitively expensive. LLMs pre-trained on huge amounts of human-written code/text have shown remarkable performance in many code intelligence tasks due to their powerful generality, even without task-specific training. Thus, LLMs can potentially circumvent the above limitations, but they have not been exhaustively explored yet. This paper investigates diverse LLMs and learning-based transpilers for automated code translation tasks, finding that: although certain LLMs have outperformed current transpilers, they still have some accuracy issues, where most of the failures are induced by a lack of comprehension of source programs, missing clear instructions on I/O types in translation, and ignoring discrepancies between source and target programs. Enlightened by the above findings, we further propose UniTrans, a Unified code Translation framework, applicable to various LLMs, for unleashing their power in this field. Specifically, UniTrans first crafts a series of test cases for target programs with the assistance of source programs. Next, it harnesses the auto-generated test cases to augment the code translation and then evaluates their correctness via execution. Afterward, UniTrans further (iteratively) repairs incorrectly translated programs prompted by test case execution results. Extensive experiments are conducted on six settings of translation datasets between Python, Java, and C++. Three recent LLMs of diverse sizes are tested with UniTrans, and all achieve substantial improvements., Comment: 23 pages, 7 figures, accepted by FSE'24 (2024 ACM International Conference on the Foundations of Software Engineering)
- Published
- 2024
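A schematic of the translate-test-repair loop described in the entry above, with mock stand-ins for the LLM call and test execution; the function names and prompts are hypothetical, not UniTrans's actual interface.

```python
# Schematic translate -> test -> repair loop; llm() and run_tests() are mocks.
def llm(prompt: str) -> str:
    return "// mock LLM response"  # stand-in for a real LLM API call

def run_tests(program: str, tests: str) -> list[str]:
    return []  # stand-in: pretend all auto-generated test cases pass

def translate_with_repair(src: str, max_rounds: int = 3) -> str:
    tests = llm(f"Write test cases (inputs and expected outputs) for:\n{src}")
    tgt = llm(f"Translate to C++, honoring these I/O types and tests:\n{src}\n{tests}")
    for _ in range(max_rounds):
        failures = run_tests(tgt, tests)
        if not failures:           # stop once execution confirms correctness
            break
        tgt = llm(f"Repair the translation given failing tests:\n{tgt}\n{failures}")
    return tgt

print(translate_with_repair("def add(a, b): return a + b"))
```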
45. Demystify Adult Learning: A Social Network and Large Language Model Assisted Approach
- Author
-
Liu, Fang, Ding, Bosheng, Guan, Chong, Wei, Zhang, Niyato, Dusit, and Tan, Justina
- Subjects
Computer Science - Social and Information Networks - Abstract
Adult learning is increasingly recognized as a crucial way for personal development and societal progress. It is, however, challenging: adult learners face unique difficulties such as balancing education with other life responsibilities. Collecting feedback from adult learners is effective in understanding their concerns and improving learning experiences, and social networks provide a rich source of real-time sentiment data from adult learners. Machine learning technologies, especially large language models (LLMs), perform well in automating sentiment analysis. However, none of these models is specialized for adult learning with accurate sentiment understanding. In this paper, we present A-Learn, which enhances adult learning sentiment analysis by customizing existing general-purpose LLMs with domain-specific datasets for adult learning. We collect adult learners' comments from social networks and label the sentiment of each comment with an existing LLM to form labelled datasets tailored for adult learning. The datasets are used to customize A-Learn from several base LLMs. We conducted experimental studies and the results reveal A-Learn's competitive sentiment analysis performance, achieving up to 91.3% accuracy with 20% improvement over the base LLM. A-Learn is also employed for word cloud analysis to identify key concerns of adult learners. The research outcome of this study highlights the importance of applying machine learning with educational expertise for teaching improvement and educational innovations that benefit adult learning and adult learners., Comment: 6 pages, 3 figures
- Published
- 2024
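The pipeline in the entry above reduces to: collect comments, weak-label them with an existing LLM, then fine-tune a base model on the result. A toy sketch with mock stand-ins follows; the labeler, trainer, and data are all invented.

```python
# Toy weak-labeling + fine-tuning pipeline; all stand-ins are invented.
def label_with_llm(comment: str) -> str:
    # Mock labeler in place of a real LLM sentiment call.
    return "negative" if "stress" in comment.lower() else "positive"

def finetune(base_model: str, dataset: list[tuple[str, str]]) -> str:
    # Stand-in for an actual fine-tuning run on the labelled dataset.
    return f"{base_model}-adult-learning"

comments = ["Balancing night classes with work is stressful",
            "Loved the flexible weekend workshop"]
dataset = [(c, label_with_llm(c)) for c in comments]
print(finetune("base-llm", dataset), dataset)
```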
46. Measurement of $e^{+}e^{-}\to \omega\eta^{\prime}$ cross sections at $\sqrt{s}=$ 2.000 to 3.080 GeV
- Author
-
BESIII Collaboration, Ablikim, M., Achasov, M. N., Adlarson, P., Ai, X. C., Aliberti, R., Amoroso, A., An, M. R., An, Q., Bai, Y., Bakina, O., Balossino, I., Ban, Y., Batozskaya, V., Begzsuren, K., Berger, N., Berlowski, M., Bertani, M., Bettoni, D., Bianchi, F., Bianco, E., Bortone, A., Boyko, I., Briere, R. A., Brueggemann, A., Cai, H., Cai, X., Calcaterra, A., Cao, G. F., Cao, N., Cetin, S. A., Chang, J. F., Chang, T. T., Chang, W. L., Che, G. R., Chelkov, G., Chen, C., Chen, Chao, Chen, G., Chen, H. S., Chen, M. L., Chen, S. J., Chen, S. M., Chen, T., Chen, X. R., Chen, X. T., Chen, Y. B., Chen, Y. Q., Chen, Z. J., Cheng, W. S., Choi, S. K., Chu, X., Cibinetto, G., Coen, S. C., Cossio, F., Cui, J. J., Dai, H. L., Dai, J. P., Dbeyssi, A., de Boer, R. E., Dedovich, D., Deng, Z. Y., Denig, A., Denysenko, I., Destefanis, M., De Mori, F., Ding, B., Ding, X. X., Ding, Y., Dong, J., Dong, L. Y., Dong, M. Y., Dong, X., Du, M. C., Du, S. X., Duan, Z. H., Egorov, P., Fan, Y. H. Y., Fan, Y. L., Fang, J., Fang, S. S., Fang, W. X., Fang, Y., Farinelli, R., Fava, L., Feldbauer, F., Felici, G., Feng, C. Q., Feng, J. H., Fischer, K, Fritsch, M., Fritzsch, C., Fu, C. D., Fu, J. L., Fu, Y. W., Gao, H., Gao, Y. N., Gao, Yang, Garbolino, S., Garzia, I., Ge, P. T., Ge, Z. W., Geng, C., Gersabeck, E. M., Gilman, A, Goetzen, K., Gong, L., Gong, W. X., Gradl, W., Gramigna, S., Greco, M., Gu, M. H., Guan, C. Y, Guan, Z. L., Guo, A. Q., Guo, L. B., Guo, M. J., Guo, R. P., Guo, Y. P., Guskov, A., Han, T. T., Han, W. Y., Hao, X. Q., Harris, F. A., He, K. K., He, K. L., Heinsius, F. H. H., Heinz, C. H., Heng, Y. K., Herold, C., Holtmann, T., Hong, P. C., Hou, G. Y., Hou, X. T., Hou, Y. R., Hou, Z. L., Hu, H. M., Hu, J. F., Hu, T., Hu, Y., Huang, G. S., Huang, K. X., Huang, L. Q., Huang, X. T., Huang, Y. P., Hussain, T., Hüsken, N, Imoehl, W., Jackson, J., Jaeger, S., Janchiv, S., Jeong, J. H., Ji, Q., Ji, Q. P., Ji, X. B., Ji, X. L., Ji, Y. Y., Jia, X. Q., Jia, Z. K., Jiang, H. J., Jiang, P. C., Jiang, S. S., Jiang, T. J., Jiang, X. S., Jiang, Y., Jiao, J. B., Jiao, Z., Jin, S., Jin, Y., Jing, M. Q., Johansson, T., K., X., Kabana, S., Kalantar-Nayestanaki, N., Kang, X. L., Kang, X. S., Kappert, R., Kavatsyuk, M., Ke, B. C., Khoukaz, A., Kiuchi, R., Kliemt, R., Kolcu, O. B., Kopf, B., Kuessner, M., Kupsc, A., Kühn, W., Lane, J. J., Larin, P., Lavania, A., Lavezzi, L., Lei, T. T., Lei, Z. H., Leithoff, H., Lellmann, M., Lenz, T., Li, C., Li, C. H., Li, Cheng, Li, D. M., Li, F., Li, G., Li, H., Li, H. B., Li, H. J., Li, H. N., Li, Hui, Li, J. R., Li, J. S., Li, J. W., Li, K. L., Li, Ke, Li, L. J, Li, L. K., Li, Lei, Li, M. H., Li, P. R., Li, Q. X., Li, S. X., Li, T., Li, W. D., Li, W. G., Li, X. H., Li, X. L., Li, Xiaoyu, Li, Y. G., Li, Z. J., Liang, C., Liang, H., Liang, Y. F., Liang, Y. T., Liao, G. R., Liao, L. Z., Liao, Y. P., Libby, J., Limphirat, A., Lin, D. X., Lin, T., Liu, B. J., Liu, B. X., Liu, C., Liu, C. X., Liu, F. H., Liu, Fang, Liu, Feng, Liu, G. M., Liu, H., Liu, H. M., Liu, Huanhuan, Liu, Huihui, Liu, J. B., Liu, J. L., Liu, J. Y., Liu, K., Liu, K. Y., Liu, Ke, Liu, L., Liu, L. C., Liu, Lu, Liu, M. H., Liu, P. L., Liu, Q., Liu, S. B., Liu, T., Liu, W. K., Liu, W. M., Liu, X., Liu, Y., Liu, Y. B., Liu, Z. A., Liu, Z. Q., Lou, X. C., Lu, F. X., Lu, H. J., Lu, J. G., Lu, X. L., Lu, Y., Lu, Y. P., Lu, Z. H., Luo, C. L., Luo, M. X., Luo, T., Luo, X. L., Lyu, X. R., Lyu, Y. F., Ma, F. C., Ma, H. L., Ma, J. L., Ma, L. L., Ma, M. M., Ma, Q. M., Ma, R. Q., Ma, R. T., Ma, X. Y., Ma, Y., Ma, Y. M., Maas, F. 
E., Maggiora, M., Malde, S., Malik, Q. A., Mangoni, A., Mao, Y. J., Mao, Z. P., Marcello, S., Meng, Z. X., Messchendorp, J. G., Mezzadri, G., Miao, H., Min, T. J., Mitchell, R. E., Mo, X. H., Muchnoi, N. Yu., Muskalla, J., Nefedov, Y., Nerling, F., Nikolaev, I. B., Ning, Z., Nisar, S., Niu, Y., Olsen, S. L., Ouyang, Q., Pacetti, S., Pan, X., Pan, Y., Pathak, A., Patteri, P., Pei, Y. P., Pelizaeus, M., Peng, H. P., Peters, K., Ping, J. L., Ping, R. G., Plura, S., Pogodin, S., Prasad, V., Qi, F. Z., Qi, H., Qi, H. R., Qi, M., Qi, T. Y., Qian, S., Qian, W. B., Qiao, C. F., Qin, J. J., Qin, L. Q., Qin, X. P., Qin, X. S., Qin, Z. H., Qiu, J. F., Qu, S. Q., Redmer, C. F., Ren, K. J., Rivetti, A., Rodin, V., Rolo, M., Rong, G., Rosner, Ch., Ruan, S. N., Salone, N., Sarantsev, A., Schelhaas, Y., Schoenning, K., Scodeggio, M., Shan, K. Y., Shan, W., Shan, X. Y., Shangguan, J. F., Shao, L. G., Shao, M., Shen, C. P., Shen, H. F., Shen, W. H., Shen, X. Y., Shi, B. A., Shi, H. C., Shi, J. L., Shi, J. Y., Shi, Q. Q., Shi, R. S., Shi, X., Song, J. J., Song, T. Z., Song, W. M., Song, Y. J., Song, Y. X., Sosio, S., Spataro, S., Stieler, F., Su, Y. J., Sun, G. B., Sun, G. X., Sun, H., Sun, H. K., Sun, J. F., Sun, K., Sun, L., Sun, S. S., Sun, T., Sun, W. Y., Sun, Y., Sun, Y. J., Sun, Y. Z., Sun, Z. T., Tan, Y. X., Tang, C. J., Tang, G. Y., Tang, J., Tang, Y. A., Tao, L. Y, Tao, Q. T., Tat, M., Teng, J. X., Thoren, V., Tian, W. H., Tian, Y., Tian, Z. F., Uman, I., Wang, S. J., Wang, B., Wang, B. L., Wang, Bo, Wang, C. W., Wang, D. Y., Wang, F., Wang, H. J., Wang, H. P., Wang, J. P., Wang, K., Wang, L. L., Wang, M., Wang, Meng, Wang, S., Wang, T., Wang, T. J., Wang, W., Wang, W. P., Wang, X., Wang, X. F., Wang, X. J., Wang, X. L., Wang, Y., Wang, Y. D., Wang, Y. F., Wang, Y. H., Wang, Y. N., Wang, Y. Q., Wang, Yaqian, Wang, Yi, Wang, Z., Wang, Z. L., Wang, Z. Y., Wang, Ziyi, Wei, D., Wei, D. H., Weidner, F., Wen, S. P., Wenzel, C. W., Wiedner, U., Wilkinson, G., Wolke, M., Wollenberg, L., Wu, C., Wu, J. F., Wu, L. H., Wu, L. J., Wu, X., Wu, X. H., Wu, Y., Wu, Y. J., Wu, Z., Xia, L., Xian, X. M., Xiang, T., Xiao, D., Xiao, G. Y., Xiao, S. Y., Xiao, Y. L., Xiao, Z. J., Xie, C., Xie, X. H., Xie, Y., Xie, Y. G., Xie, Y. H., Xie, Z. P., Xing, T. Y., Xu, C. F., Xu, C. J., Xu, G. F., Xu, H. Y., Xu, Q. J., Xu, Q. N., Xu, W., Xu, W. L., Xu, X. P., Xu, Y. C., Xu, Z. P., Xu, Z. S., Yan, F., Yan, L., Yan, W. B., Yan, W. C., Yan, X. Q., Yang, H. J., Yang, H. L., Yang, H. X., Yang, Tao, Yang, Y., Yang, Y. F., Yang, Y. X., Yang, Yifan, Yang, Z. W., Yao, Z. P., Ye, M., Ye, M. H., Yin, J. H., You, Z. Y., Yu, B. X., Yu, C. X., Yu, G., Yu, J. S., Yu, T., Yu, X. D., Yuan, C. Z., Yuan, L., Yuan, S. C., Yuan, X. Q., Yuan, Y., Yuan, Z. Y., Yue, C. X., Zafar, A. A., Zeng, F. R., Zeng, X., Zeng, Y., Zeng, Y. J., Zhai, X. Y., Zhai, Y. C., Zhan, Y. H., Zhang, A. Q., Zhang, B. L., Zhang, B. X., Zhang, D. H., Zhang, G. Y., Zhang, H., Zhang, H. H., Zhang, H. Q., Zhang, H. Y., Zhang, J., Zhang, J. J., Zhang, J. L., Zhang, J. Q., Zhang, J. W., Zhang, J. X., Zhang, J. Y., Zhang, J. Z., Zhang, Jianyu, Zhang, Jiawei, Zhang, L. M., Zhang, L. Q., Zhang, Lei, Zhang, P., Zhang, Q. Y., Zhang, Shuihan, Zhang, Shulei, Zhang, X. D., Zhang, X. M., Zhang, X. Y., Zhang, Xuyan, Zhang, Y., Zhang, Y. T., Zhang, Y. H., Zhang, Yan, Zhang, Yao, Zhang, Z. H., Zhang, Z. L., Zhang, Z. Y., Zhao, G., Zhao, J., Zhao, J. Y., Zhao, J. Z., Zhao, Lei, Zhao, Ling, Zhao, M. G., Zhao, S. J., Zhao, Y. B., Zhao, Y. X., Zhao, Z. G., Zhemchugov, A., Zheng, B., Zheng, J. 
P., Zheng, W. J., Zheng, Y. H., Zhong, B., Zhong, X., Zhou, H., Zhou, L. P., Zhou, X., Zhou, X. K., Zhou, X. R., Zhou, X. Y., Zhou, Y. Z., Zhu, J., Zhu, K., Zhu, K. J., Zhu, L., Zhu, L. X., Zhu, S. H., Zhu, S. Q., Zhu, T. J., Zhu, W. J., Zhu, Y. C., Zhu, Z. A., Zou, J. H., and Zu, J.
- Subjects
High Energy Physics - Experiment - Abstract
The Born cross sections for the process $e^{+}e^{-}\to \omega\eta^{\prime}$ are measured at 22 center-of-mass energies from 2.000 to 3.080 GeV using data collected with the BESIII detector at the BEPCII collider. A resonant structure is observed with a statistical significance of 9.6$\sigma$. A Breit-Wigner fit determines its mass to be $M_R=(2153\pm30\pm31)~{\rm{MeV}}/c^{2}$ and its width to be $\Gamma_{R}=(167\pm77\pm7)~\rm{MeV}$, where the first uncertainties are statistical and the second are systematic. (A schematic line-shape parametrization is sketched after this record.)
- Published
- 2024
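For orientation on the fit quoted in the abstract above: a resonance in an $e^{+}e^{-}$ cross-section line shape is commonly described by a Breit-Wigner amplitude, often interfering with a smooth continuum term. A minimal textbook parametrization (not necessarily the collaboration's exact fit model) is

$\sigma_{\rm BW}(\sqrt{s}) = \frac{12\pi\,\Gamma_{e^{+}e^{-}}\,\mathcal{B}(R\to\omega\eta^{\prime})\,\Gamma_{R}}{(s-M_{R}^{2})^{2}+M_{R}^{2}\Gamma_{R}^{2}},$

where $M_{R}$ and $\Gamma_{R}$ are the resonance mass and total width, $\Gamma_{e^{+}e^{-}}$ is the electronic partial width, and $\mathcal{B}$ is the branching fraction into $\omega\eta^{\prime}$. Fitting such a shape (plus a continuum term) to the 22 measured cross sections is how values like the quoted $M_{R}$ and $\Gamma_{R}$ are typically extracted.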
47. Measurement of the Born cross section for $e^{+}e^{-}\to \eta h_c $ at center-of-mass energies between 4.1 and 4.6\,GeV
- Author
-
BESIII Collaboration, Ablikim, M., Achasov, M. N., Adlarson, P., Afedulidis, O., Ai, X. C., Aliberti, R., Amoroso, A., An, Q., Bai, Y., Bakina, O., Balossino, I., Ban, Y., Bao, H. -R., Batozskaya, V., Begzsuren, K., Berger, N., Berlowski, M., Bertani, M., Bettoni, D., Bianchi, F., Bianco, E., Bortone, A., Boyko, I., Briere, R. A., Brueggemann, A., Cai, H., Cai, X., Calcaterra, A., Cao, G. F., Cao, N., Cetin, S. A., Chang, J. F., Che, G. R., Chelkov, G., Chen, C., Chen, C. H., Chen, Chao, Chen, G., Chen, H. S., Chen, H. Y., Chen, M. L., Chen, S. J., Chen, S. L., Chen, S. M., Chen, T., Chen, X. R., Chen, X. T., Chen, Y. B., Chen, Y. Q., Chen, Z. J., Chen, Z. Y., Choi, S. K., Cibinetto, G., Cossio, F., Cui, J. J., Dai, H. L., Dai, J. P., Dbeyssi, A., de Boer, R. E., Dedovich, D., Deng, C. Q., Deng, Z. Y., Denig, A., Denysenko, I., Destefanis, M., De Mori, F., Ding, B., Ding, X. X., Ding, Y., Dong, J., Dong, L. Y., Dong, M. Y., Dong, X., Du, M. C., Du, S. X., Duan, Z. H., Egorov, P., Fan, Y. H., Fang, J., Fang, S. S., Fang, W. X., Fang, Y., Fang, Y. Q., Farinelli, R., Fava, L., Feldbauer, F., Felici, G., Feng, C. Q., Feng, J. H., Feng, Y. T., Fritsch, M., Fu, C. D., Fu, J. L., Fu, Y. W., Gao, H., Gao, X. B., Gao, Y. N., Gao, Yang, Garbolino, S., Garzia, I., Ge, L., Ge, P. T., Ge, Z. W., Geng, C., Gersabeck, E. M., Gilman, A., Goetzen, K., Gong, L., Gong, W. X., Gradl, W., Gramigna, S., Greco, M., Gu, M. H., Gu, Y. T., Guan, C. Y., Guan, Z. L., Guo, A. Q., Guo, L. B., Guo, M. J., Guo, R. P., Guo, Y. P., Guskov, A., Gutierrez, J., Han, K. L., Han, T. T., Hao, X. Q., Harris, F. A., He, K. K., He, K. L., Heinsius, F. H., Heinz, C. H., Heng, Y. K., Herold, C., Holtmann, T., Hong, P. C., Hou, G. Y., Hou, X. T., Hou, Y. R., Hou, Z. L., Hu, B. Y., Hu, H. M., Hu, J. F., Hu, S. L., Hu, T., Hu, Y., Huang, G. S., Huang, K. X., Huang, L. Q., Huang, X. T., Huang, Y. P., Hussain, T., Hölzken, F., Hüsken, N, der Wiesche, N. in, Jackson, J., Janchiv, S., Jeong, J. H., Ji, Q., Ji, Q. P., Ji, W., Ji, X. B., Ji, X. L., Ji, Y. Y., Jia, X. Q., Jia, Z. K., Jiang, D., Jiang, H. B., Jiang, P. C., Jiang, S. S., Jiang, T. J., Jiang, X. S., Jiang, Y., Jiao, J. B., Jiao, J. K., Jiao, Z., Jin, S., Jin, Y., Jing, M. Q., Jing, X. M., Johansson, T., Kabana, S., Kalantar-Nayestanaki, N., Kang, X. L., Kang, X. S., Kavatsyuk, M., Ke, B. C., Khachatryan, V., Khoukaz, A., Kiuchi, R., Kolcu, O. B., Kopf, B., Kuessner, M., Kui, X., Kumar, N., Kupsc, A., Kühn, W., Lane, J. J., Larin, P., Lavezzi, L., Lei, T. T., Lei, Z. H., Lellmann, M., Lenz, T., Li, C., Li, C. H., Li, Cheng, Li, D. M., Li, F., Li, G., Li, H. B., Li, H. J., Li, H. N., Li, Hui, Li, J. R., Li, J. S., Li, Ke, Li, L. J, Li, L. K., Li, Lei, Li, M. H., Li, P. R., Li, Q. M., Li, Q. X., Li, R., Li, S. X., Li, T., Li, W. D., Li, W. G., Li, X., Li, X. H., Li, X. L., Li, X. Z., Li, Xiaoyu, Li, Y. G., Li, Z. J., Li, Z. X., Liang, C., Liang, H., Liang, Y. F., Liang, Y. T., Liao, G. R., Liao, L. Z., Libby, J., Limphirat, A., Lin, C. C., Lin, D. X., Lin, T., Liu, B. J., Liu, B. X., Liu, C., Liu, C. X., Liu, F. H., Liu, Fang, Liu, Feng, Liu, G. M., Liu, H., Liu, H. B., Liu, H. M., Liu, Huanhuan, Liu, Huihui, Liu, J. B., Liu, J. Y., Liu, K., Liu, K. Y., Liu, Ke, Liu, L., Liu, L. C., Liu, Lu, Liu, M. H., Liu, P. L., Liu, Q., Liu, S. B., Liu, T., Liu, W. K., Liu, W. M., Liu, X., Liu, Y., Liu, Y. B., Liu, Z. A., Liu, Z. D., Liu, Z. Q., Lou, X. C., Lu, F. X., Lu, H. J., Lu, J. G., Lu, X. L., Lu, Y., Lu, Y. P., Lu, Z. H., Luo, C. L., Luo, M. X., Luo, T., Luo, X. L., Lyu, X. R., Lyu, Y. 
F., Ma, F. C., Ma, H., Ma, H. L., Ma, J. L., Ma, L. L., Ma, M. M., Ma, Q. M., Ma, R. Q., Ma, T., Ma, X. T., Ma, X. Y., Ma, Y., Ma, Y. M., Maas, F. E., Maggiora, M., Malde, S., Mao, Y. J., Mao, Z. P., Marcello, S., Meng, Z. X., Messchendorp, J. G., Mezzadri, G., Miao, H., Min, T. J., Mitchell, R. E., Mo, X. H., Moses, B., Muchnoi, N. Yu., Muskalla, J., Nefedov, Y., Nerling, F., Nie, L. S., Nikolaev, I. B., Ning, Z., Nisar, S., Niu, Q. L., Niu, W. D., Niu, Y., Olsen, S. L., Ouyang, Q., Pacetti, S., Pan, X., Pan, Y., Pathak, A., Patteri, P., Pei, Y. P., Pelizaeus, M., Peng, H. P., Peng, Y. Y., Peters, K., Ping, J. L., Ping, R. G., Plura, S., Prasad, V., Qi, F. Z., Qi, H., Qi, H. R., Qi, M., Qi, T. Y., Qian, S., Qian, W. B., Qiao, C. F., Qiao, X. K., Qin, J. J., Qin, L. Q., Qin, L. Y., Qin, X. S., Qin, Z. H., Qiu, J. F., Qu, Z. H., Redmer, C. F., Ren, K. J., Rivetti, A., Rolo, M., Rong, G., Rosner, Ch., Ruan, S. N., Salone, N., Sarantsev, A., Schelhaas, Y., Schoenning, K., Scodeggio, M., Shan, K. Y., Shan, W., Shan, X. Y., Shang, Z. J, Shangguan, J. F., Shao, L. G., Shao, M., Shen, C. P., Shen, H. F., Shen, W. H., Shen, X. Y., Shi, B. A., Shi, H., Shi, H. C., Shi, J. L., Shi, J. Y., Shi, Q. Q., Shi, S. Y., Shi, X., Song, J. J., Song, T. Z., Song, W. M., Song, Y. J., Song, Y. X., Sosio, S., Spataro, S., Stieler, F., Su, Y. J., Sun, G. B., Sun, G. X., Sun, H., Sun, H. K., Sun, J. F., Sun, K., Sun, L., Sun, S. S., Sun, T., Sun, W. Y., Sun, Y., Sun, Y. J., Sun, Y. Z., Sun, Z. Q., Sun, Z. T., Tang, C. J., Tang, G. Y., Tang, J., Tang, M., Tang, Y. A., Tao, L. Y., Tao, Q. T., Tat, M., Teng, J. X., Thoren, V., Tian, W. H., Tian, Y., Tian, Z. F., Uman, I., Wan, Y., Wang, S. J., Wang, B., Wang, B. L., Wang, Bo, Wang, D. Y., Wang, F., Wang, H. J., Wang, J. J., Wang, J. P., Wang, K., Wang, L. L., Wang, M., Wang, Meng, Wang, N. Y., Wang, S., Wang, T., Wang, T. J., Wang, W., Wang, W. P., Wang, X., Wang, X. F., Wang, X. J., Wang, X. L., Wang, X. N., Wang, Y., Wang, Y. D., Wang, Y. F., Wang, Y. L., Wang, Y. N., Wang, Y. Q., Wang, Yaqian, Wang, Yi, Wang, Z., Wang, Z. L., Wang, Z. Y., Wang, Ziyi, Wei, D. H., Weidner, F., Wen, S. P., Wen, Y. R., Wiedner, U., Wilkinson, G., Wolke, M., Wollenberg, L., Wu, C., Wu, J. F., Wu, L. H., Wu, L. J., Wu, X., Wu, X. H., Wu, Y., Wu, Y. H., Wu, Y. J., Wu, Z., Xia, L., Xian, X. M., Xiang, B. H., Xiang, T., Xiao, D., Xiao, G. Y., Xiao, S. Y., Xiao, Y. L., Xiao, Z. J., Xie, C., Xie, X. H., Xie, Y., Xie, Y. G., Xie, Y. H., Xie, Z. P., Xing, T. Y., Xu, C. F., Xu, C. J., Xu, G. F., Xu, H. Y., Xu, M., Xu, Q. J., Xu, Q. N., Xu, W., Xu, W. L., Xu, X. P., Xu, Y. C., Xu, Z. P., Xu, Z. S., Yan, F., Yan, L., Yan, W. B., Yan, W. C., Yan, X. Q., Yang, H. J., Yang, H. L., Yang, H. X., Yang, Tao, Yang, Y., Yang, Y. F., Yang, Y. X., Yang, Yifan, Yang, Z. W., Yao, Z. P., Ye, M., Ye, M. H., Yin, J. H., You, Z. Y., Yu, B. X., Yu, C. X., Yu, G., Yu, J. S., Yu, T., Yu, X. D., Yu, Y. C., Yuan, C. Z., Yuan, J., Yuan, L., Yuan, S. C., Yuan, Y., Yuan, Y. J., Yuan, Z. Y., Yue, C. X., Zafar, A. A., Zeng, F. R., Zeng, S. H., Zeng, X., Zeng, Y., Zeng, Y. J., Zhai, X. Y., Zhai, Y. C., Zhan, Y. H., Zhang, A. Q., Zhang, B. L., Zhang, B. X., Zhang, D. H., Zhang, G. Y., Zhang, H., Zhang, H. C., Zhang, H. H., Zhang, H. Q., Zhang, H. R., Zhang, H. Y., Zhang, J., Zhang, J. J., Zhang, J. L., Zhang, J. Q., Zhang, J. S., Zhang, J. W., Zhang, J. X., Zhang, J. Y., Zhang, J. Z., Zhang, Jianyu, Zhang, L. M., Zhang, Lei, Zhang, P., Zhang, Q. Y., Zhang, R. Y, Zhang, Shuihan, Zhang, Shulei, Zhang, X. D., Zhang, X. 
M., Zhang, X. Y., Zhang, Y., Zhang, Y. T., Zhang, Y. H., Zhang, Y. M., Zhang, Yan, Zhang, Yao, Zhang, Z. D., Zhang, Z. H., Zhang, Z. L., Zhang, Z. Y., Zhang, Z. Z., Zhao, G., Zhao, J. Y., Zhao, J. Z., Zhao, Lei, Zhao, Ling, Zhao, M. G., Zhao, N., Zhao, R. P., Zhao, S. J., Zhao, Y. B., Zhao, Y. X., Zhao, Z. G., Zhemchugov, A., Zheng, B., Zheng, B. M., Zheng, J. P., Zheng, W. J., Zheng, Y. H., Zhong, B., Zhong, X., Zhou, H., Zhou, J. Y., Zhou, L. P., Zhou, S., Zhou, X., Zhou, X. K., Zhou, X. R., Zhou, X. Y., Zhou, Y. Z., Zhu, J., Zhu, K., Zhu, K. J., Zhu, K. S., Zhu, L., Zhu, L. X., Zhu, S. H., Zhu, S. Q., Zhu, T. J., Zhu, W. D., Zhu, Y. C., Zhu, Z. A., Zou, J. H., and Zu, J.
- Subjects
High Energy Physics - Experiment - Abstract
We measure the Born cross section for the reaction $e^{+}e^{-} \rightarrow \eta h_c$ from $\sqrt{s} = 4.129$ to $4.600$~GeV using data sets collected by the BESIII detector running at the BEPCII collider. A resonant structure in the cross-section line shape near 4.200~GeV is observed with a statistical significance of 7$\sigma$. The parameters of this resonance are measured to be \MeasMass\ and \MeasWidth, where the first uncertainties are statistical and the second systematic. (A sketch of a typical significance calculation follows this record.)
- Published
- 2024
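The 7$\sigma$ statistical significance quoted above is the kind of figure usually obtained from a likelihood-ratio test between fits with and without the resonance. A minimal sketch under that assumption (the record does not state the collaboration's exact procedure; nll_null and nll_sig are hypothetical negative log-likelihoods of the two fits):

import math

def significance(nll_null, nll_sig):
    # Likelihood-ratio test statistic: 2*(lnL_sig - lnL_null),
    # i.e. 2*(nll_null - nll_sig) for negative log-likelihoods.
    # By Wilks' theorem it is approximately chi^2-distributed, and
    # for one extra free parameter Z is roughly its square root.
    return math.sqrt(max(0.0, 2.0 * (nll_null - nll_sig)))

# Example: significance(1250.0, 1225.5) is about 7.0.

Real analyses also account for multiple free resonance parameters and systematic effects, so this one-liner is only illustrative.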
48. Search for the Rare Decays $D_s^+\to h^+(h^{0})e^+e^-$
- Author
-
BESIII Collaboration, Ablikim, M., Achasov, M. N., Adlarson, P., Afedulidis, O., Ai, X. C., Aliberti, R., Amoroso, A., An, Q., Bai, Y., Bakina, O., Balossino, I., Ban, Y., Bao, H. -R., Batozskaya, V., Begzsuren, K., Berger, N., Berlowski, M., Bertani, M., Bettoni, D., Bianchi, F., Bianco, E., Bortone, A., Boyko, I., Briere, R. A., Brueggemann, A., Cai, H., Cai, X., Calcaterra, A., Cao, G. F., Cao, N., Cetin, S. A., Chang, J. F., Chang, W. L., Che, G. R., Chelkov, G., Chen, C., Chen, C. H., Chen, Chao, Chen, G., Chen, H. S., Chen, M. L., Chen, S. J., Chen, S. L., Chen, S. M., Chen, T., Chen, X. R., Chen, X. T., Chen, Y. B., Chen, Y. Q., Chen, Z. J., Chen, Z. Y., Choi, S. K., Chu, X., Cibinetto, G., Cossio, F., Cui, J. J., Dai, H. L., Dai, J. P., Dbeyssi, A., de Boer, R. E., Dedovich, D., Deng, C. Q., Deng, Z. Y., Denig, A., Denysenko, I., Destefanis, M., De Mori, F., Ding, B., Ding, X. X., Ding, Y., Dong, J., Dong, L. Y., Dong, M. Y., Dong, X., Du, M. C., Du, S. X., Duan, Z. H., Egorov, P., Fan, Y. H., Fang, J., Fang, S. S., Fang, W. X., Fang, Y., Fang, Y. Q., Farinelli, R., Fava, L., Feldbauer, F., Felici, G., Feng, C. Q., Feng, J. H., Feng, Y. T., Fischer, K., Fritsch, M., Fu, C. D., Fu, J. L., Fu, Y. W., Gao, H., Gao, Y. N., Gao, Yang, Garbolino, S., Garzia, I., Ge, P. T., Ge, Z. W., Geng, C., Gersabeck, E. M., Gilman, A., Goetzen, K., Gong, L., Gong, W. X., Gradl, W., Gramigna, S., Greco, M., Gu, M. H., Gu, Y. T., Guan, C. Y., Guan, Z. L., Guo, A. Q., Guo, L. B., Guo, M. J., Guo, R. P., Guo, Y. P., Guskov, A., Gutierrez, J., Han, K. L., Han, T. T., Hao, X. Q., Harris, F. A., He, K. K., He, K. L., Heinsius, F. H., Heinz, C. H., Heng, Y. K., Herold, C., Holtmann, T., Hong, P. C., Hou, G. Y., Hou, X. T., Hou, Y. R., Hou, Z. L., Hu, B. Y., Hu, H. M., Hu, J. F., Hu, T., Hu, Y., Huang, G. S., Huang, K. X., Huang, L. Q., Huang, X. T., Huang, Y. P., Hussain, T., Hölzken, F., Hüsken, N, der Wiesche, N. in, Irshad, M., Jackson, J., Janchiv, S., Jeong, J. H., Ji, Q., Ji, Q. P., Ji, W., Ji, X. B., Ji, X. L., Ji, Y. Y., Jia, X. Q., Jia, Z. K., Jiang, D., Jiang, H. B., Jiang, P. C., Jiang, S. S., Jiang, T. J., Jiang, X. S., Jiang, Y., Jiao, J. B., Jiao, J. K., Jiao, Z., Jin, S., Jin, Y., Jing, M. Q., Jing, X. M., Johansson, T., Kabana, S., Kalantar-Nayestanaki, N., Kang, X. L., Kang, X. S., Kavatsyuk, M., Ke, B. C., Khachatryan, V., Khoukaz, A., Kiuchi, R., Kolcu, O. B., Kopf, B., Kuessner, M., Kui, X., Kupsc, A., Kühn, W., Lane, J. J., Larin, P., Lavezzi, L., Lei, T. T., Lei, Z. H., Leithoff, H., Lellmann, M., Lenz, T., Li, C., Li, C. H., Li, Cheng, Li, D. M., Li, F., Li, G., Li, H., Li, H. B., Li, H. J., Li, H. N., Li, Hui, Li, J. R., Li, J. S., Li, Ke, Li, L. J, Li, L. K., Li, Lei, Li, M. H., Li, P. R., Li, Q. M., Li, Q. X., Li, R., Li, S. X., Li, T., Li, W. D., Li, W. G., Li, X., Li, X. H., Li, X. L., Li, Xiaoyu, Li, Y. G., Li, Z. J., Li, Z. X., Liang, C., Liang, H., Liang, Y. F., Liang, Y. T., Liao, G. R., Liao, L. Z., Liao, Y. P., Libby, J., Limphirat, A., Lin, D. X., Lin, T., Liu, B. J., Liu, B. X., Liu, C., Liu, C. X., Liu, F. H., Liu, Fang, Liu, Feng, Liu, G. M., Liu, H., Liu, H. B., Liu, H. M., Liu, Huanhuan, Liu, Huihui, Liu, J. B., Liu, J. Y., Liu, K., Liu, K. Y., Liu, Ke, Liu, L., Liu, L. C., Liu, Lu, Liu, M. H., Liu, P. L., Liu, Q., Liu, S. B., Liu, T., Liu, W. K., Liu, W. M., Liu, X., Liu, Y., Liu, Y. B., Liu, Z. A., Liu, Z. D., Liu, Z. Q., Lou, X. C., Lu, F. X., Lu, H. J., Lu, J. G., Lu, X. L., Lu, Y., Lu, Y. P., Lu, Z. H., Luo, C. L., Luo, M. X., Luo, T., Luo, X. L., Lyu, X. 
R., Lyu, Y. F., Ma, F. C., Ma, H., Ma, H. L., Ma, J. L., Ma, L. L., Ma, M. M., Ma, Q. M., Ma, R. Q., Ma, X. T., Ma, X. Y., Ma, Y., Ma, Y. M., Maas, F. E., Maggiora, M., Malde, S., Mangoni, A., Mao, Y. J., Mao, Z. P., Marcello, S., Meng, Z. X., Messchendorp, J. G., Mezzadri, G., Miao, H., Min, T. J., Mitchell, R. E., Mo, X. H., Moses, B., Muchnoi, N. Yu., Muskalla, J., Nefedov, Y., Nerling, F., Nikolaev, I. B., Ning, Z., Nisar, S., Niu, Q. L., Niu, W. D., Niu, Y., Olsen, S. L., Ouyang, Q., Pacetti, S., Pan, X., Pan, Y., Pathak, A., Patteri, P., Pei, Y. P., Pelizaeus, M., Peng, H. P., Peng, Y. Y., Peters, K., Ping, J. L., Ping, R. G., Plura, S., Prasad, V., Qi, F. Z., Qi, H., Qi, H. R., Qi, M., Qi, T. Y., Qian, S., Qian, W. B., Qiao, C. F., Qin, J. J., Qin, L. Q., Qin, X. S., Qin, Z. H., Qiu, J. F., Qu, S. Q., Qu, Z. H., Redmer, C. F., Ren, K. J., Rivetti, A., Rolo, M., Rong, G., Rosner, Ch., Ruan, S. N., Salone, N., Sarantsev, A., Schelhaas, Y., Schoenning, K., Scodeggio, M., Shan, K. Y., Shan, W., Shan, X. Y., Shangguan, J. F., Shao, L. G., Shao, M., Shen, C. P., Shen, H. F., Shen, W. H., Shen, X. Y., Shi, B. A., Shi, H. C., Shi, J. L., Shi, J. Y., Shi, Q. Q., Shi, R. S., Shi, S. Y., Shi, X., Song, J. J., Song, T. Z., Song, W. M., Song, Y. J., Sosio, S., Spataro, S., Stieler, F., Su, Y. J., Sun, G. B., Sun, G. X., Sun, H., Sun, H. K., Sun, J. F., Sun, K., Sun, L., Sun, S. S., Sun, T., Sun, W. Y., Sun, Y., Sun, Y. J., Sun, Y. Z., Sun, Z. Q., Sun, Z. T., Tang, C. J., Tang, G. Y., Tang, J., Tang, Y. A., Tao, L. Y., Tao, Q. T., Tat, M., Teng, J. X., Thoren, V., Tian, W. H., Tian, Y., Tian, Z. F., Uman, I., Wan, Y., Wang, S. J., Wang, B., Wang, B. L., Wang, Bo, Wang, D. Y., Wang, F., Wang, H. J., Wang, J. P., Wang, K., Wang, L. L., Wang, M., Wang, Meng, Wang, N. Y., Wang, S., Wang, T., Wang, T. J., Wang, W., Wang, W. P., Wang, X., Wang, X. F., Wang, X. J., Wang, X. L., Wang, X. N., Wang, Y., Wang, Y. D., Wang, Y. F., Wang, Y. L., Wang, Y. N., Wang, Y. Q., Wang, Yaqian, Wang, Yi, Wang, Z., Wang, Z. L., Wang, Z. Y., Wang, Ziyi, Wei, D., Wei, D. H., Weidner, F., Wen, S. P., Wen, Y. R., Wiedner, U., Wilkinson, G., Wolke, M., Wollenberg, L., Wu, C., Wu, J. F., Wu, L. H., Wu, L. J., Wu, X., Wu, X. H., Wu, Y., Wu, Y. H., Wu, Y. J., Wu, Z., Xia, L., Xian, X. M., Xiang, B. H., Xiang, T., Xiao, D., Xiao, G. Y., Xiao, S. Y., Xiao, Y. L., Xiao, Z. J., Xie, C., Xie, X. H., Xie, Y., Xie, Y. G., Xie, Y. H., Xie, Z. P., Xing, T. Y., Xu, C. F., Xu, C. J., Xu, G. F., Xu, H. Y., Xu, Q. J., Xu, Q. N., Xu, W., Xu, W. L., Xu, X. P., Xu, Y. C., Xu, Z. P., Xu, Z. S., Yan, F., Yan, L., Yan, W. B., Yan, W. C., Yan, X. Q., Yang, H. J., Yang, H. L., Yang, H. X., Yang, Tao, Yang, Y., Yang, Y. F., Yang, Y. X., Yang, Yifan, Yang, Z. W., Yao, Z. P., Ye, M., Ye, M. H., Yin, J. H., You, Z. Y., Yu, B. X., Yu, C. X., Yu, G., Yu, J. S., Yu, T., Yu, X. D., Yuan, C. Z., Yuan, J., Yuan, L., Yuan, S. C., Yuan, Y., Yuan, Z. Y., Yue, C. X., Zafar, A. A., Zeng, F. R., Zeng, S. H., Zeng, X., Zeng, Y., Zeng, Y. J., Zhai, X. Y., Zhai, Y. C., Zhan, Y. H., Zhang, A. Q., Zhang, B. L., Zhang, B. X., Zhang, D. H., Zhang, G. Y., Zhang, H., Zhang, H. C., Zhang, H. H., Zhang, H. Q., Zhang, H. Y., Zhang, J., Zhang, J. J., Zhang, J. L., Zhang, J. Q., Zhang, J. W., Zhang, J. X., Zhang, J. Y., Zhang, J. Z., Zhang, Jianyu, Zhang, L. M., Zhang, Lei, Zhang, P., Zhang, Q. Y., Zhang, Shuihan, Zhang, Shulei, Zhang, X. D., Zhang, X. M., Zhang, X. Y., Zhang, Y., Zhang, Y. T., Zhang, Y. H., Zhang, Y. M., Zhang, Yan, Zhang, Yao, Zhang, Z. D., Zhang, Z. 
H., Zhang, Z. L., Zhang, Z. Y., Zhao, G., Zhao, J. Y., Zhao, J. Z., Zhao, Lei, Zhao, Ling, Zhao, M. G., Zhao, R. P., Zhao, S. J., Zhao, Y. B., Zhao, Y. X., Zhao, Z. G., Zhemchugov, A., Zheng, B., Zheng, J. P., Zheng, W. J., Zheng, Y. H., Zhong, B., Zhong, X., Zhou, H., Zhou, J. Y., Zhou, L. P., Zhou, X., Zhou, X. K., Zhou, X. R., Zhou, X. Y., Zhou, Y. Z., Zhu, J., Zhu, K., Zhu, K. J., Zhu, L., Zhu, L. X., Zhu, S. H., Zhu, S. Q., Zhu, T. J., Zhu, W. J., Zhu, Y. C., Zhu, Z. A., Zou, J. H., and Zu, J.
- Subjects
High Energy Physics - Experiment - Abstract
Using 7.33~fb$^{-1}$ of $e^{+}e^{-}$ collision data collected by the BESIII detector at center-of-mass energies in the range of $\sqrt{s}=4.128 - 4.226$~GeV, we search for the rare decays $D_{s}^+\to h^+(h^{0})e^{+}e^{-}$, where $h$ represents a kaon or pion. By requiring the $e^{+}e^{-}$ invariant mass to be consistent with a $\phi(1020)$, $0.98
- Published
- 2024
- Full Text
- View/download PDF
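The $\phi(1020)$ requirement mentioned in the (truncated) abstract above is an invariant-mass window on the $e^{+}e^{-}$ pair. A minimal sketch of such a selection; the record's upper edge is cut off, so the 1.04 GeV/$c^{2}$ bound below is an assumed placeholder, and the four-vectors are hypothetical inputs:

import math

def inv_mass(p1, p2):
    # Invariant mass of a two-particle system from (E, px, py, pz)
    # four-vectors in GeV: m^2 = E^2 - |p|^2.
    E  = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(0.0, E*E - (px*px + py*py + pz*pz)))

PHI_WINDOW = (0.98, 1.04)  # GeV/c^2; lower edge from the record, upper edge assumed

def passes_phi_window(p_eplus, p_eminus):
    return PHI_WINDOW[0] < inv_mass(p_eplus, p_eminus) < PHI_WINDOW[1]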
49. Map Optical Properties to Subwavelength Structures Directly via a Diffusion Model
- Author
-
Rao, Shijie, Cui, Kaiyu, Huang, Yidong, Yang, Jiawei, Li, Yali, Wang, Shengjin, Feng, Xue, Liu, Fang, and Zhang, Wei
- Subjects
Physics - Optics ,Computer Science - Artificial Intelligence - Abstract
Subwavelength photonic structures and metamaterials provide revolutionary approaches for controlling light, and the inverse design methods proposed for these structures are vital to the development of new photonic devices. However, most existing inverse design methods cannot map optical properties directly to photonic structures; instead, they rely on forward simulations to drive iterative optimization. In this work, we exploit the powerful generative abilities of artificial intelligence (AI) and propose a practical inverse design method based on latent diffusion models. Our method maps optical properties directly to structures, with no need for forward simulation or iterative optimization. Here, the given optical properties act as "prompts" that guide the constructed model to correctly "draw" the required photonic structures. Experiments show that our direct-mapping inverse design method can generate subwavelength photonic structures with high fidelity while adhering to the given optical properties. This may change how optical structures are designed and greatly accelerate research on new photonic devices. (An illustrative sampling sketch follows this record.)
- Published
- 2024
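The paper above maps optical properties directly to structures by conditioning a latent diffusion model on the target properties. A minimal DDPM-style ancestral-sampling sketch of that idea, assuming a trained noise-prediction network denoiser(z, t, cond), a property embedding cond, and a separate latent autoencoder for decoding (all names here are hypothetical, not the authors' API):

import torch

@torch.no_grad()
def sample_structure(denoiser, cond, steps=50, latent_shape=(1, 4, 32, 32)):
    # Basic linear noise schedule, as in a vanilla DDPM.
    betas = torch.linspace(1e-4, 0.02, steps)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)
    z = torch.randn(latent_shape)  # start from pure latent noise
    for t in reversed(range(steps)):
        # The property embedding acts as the "prompt" guiding denoising.
        eps = denoiser(z, torch.tensor([t]), cond)
        coef = betas[t] / torch.sqrt(1.0 - alpha_bars[t])
        z = (z - coef * eps) / torch.sqrt(alphas[t])  # DDPM posterior mean
        if t > 0:
            z = z + torch.sqrt(betas[t]) * torch.randn_like(z)  # stochastic step
    return z  # decode with the latent autoencoder to obtain the structure layout

Because sampling needs only forward passes of the denoiser, no electromagnetic forward simulation or iterative optimization loop appears in the design step, which is the efficiency argument made in the abstract.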
50. Search for $C$-even states decaying to $D_{s}^{\pm}D_{s}^{*\mp}$ with masses between $4.08$ and $4.32~\mathrm{GeV}/c^{2}$
- Author
-
BESIII Collaboration, Ablikim, M., Achasov, M. N., Adlarson, P., Afedulidis, O., Ai, X. C., Aliberti, R., Amoroso, A., An, Q., Bai, Y., Bakina, O., Balossino, I., Ban, Y., Bao, H. -R., Batozskaya, V., Begzsuren, K., Berger, N., Berlowski, M., Bertani, M., Bettoni, D., Bianchi, F., Bianco, E., Bortone, A., Boyko, I., Briere, R. A., Brueggemann, A., Cai, H., Cai, X., Calcaterra, A., Cao, G. F., Cao, N., Cetin, S. A., Chang, J. F., Che, G. R., Chelkov, G., Chen, C., Chen, C. H., Chen, Chao, Chen, G., Chen, H. S., Chen, H. Y., Chen, M. L., Chen, S. J., Chen, S. L., Chen, S. M., Chen, T., Chen, X. R., Chen, X. T., Chen, Y. B., Chen, Y. Q., Chen, Z. J., Chen, Z. Y., Choi, S. K., Cibinetto, G., Cossio, F., Cui, J. J., Dai, H. L., Dai, J. P., Dbeyssi, A., de Boer, R. E., Dedovich, D., Deng, C. Q., Deng, Z. Y., Denig, A., Denysenko, I., Destefanis, M., De Mori, F., Ding, B., Ding, X. X., Ding, Y., Dong, J., Dong, L. Y., Dong, M. Y., Dong, X., Du, M. C., Du, S. X., Duan, Y. Y., Duan, Z. H., Egorov, P., Fan, Y. H., Fang, J., Fang, S. S., Fang, W. X., Fang, Y., Fang, Y. Q., Farinelli, R., Fava, L., Feldbauer, F., Felici, G., Feng, C. Q., Feng, J. H., Feng, Y. T., Fritsch, M., Fu, C. D., Fu, J. L., Fu, Y. W., Gao, H., Gao, X. B., Gao, Y. N., Gao, Yang, Garbolino, S., Garzia, I., Ge, L., Ge, P. T., Ge, Z. W., Geng, C., Gersabeck, E. M., Gilman, A., Goetzen, K., Gong, L., Gong, W. X., Gradl, W., Gramigna, S., Greco, M., Gu, M. H., Gu, Y. T., Guan, C. Y., Guan, Z. L., Guo, A. Q., Guo, L. B., Guo, M. J., Guo, R. P., Guo, Y. P., Guskov, A., Gutierrez, J., Han, K. L., Han, T. T., Hanisch, F., Hao, X. Q., Harris, F. A., He, K. K., He, K. L., Heinsius, F. H., Heinz, C. H., Heng, Y. K., Herold, C., Holtmann, T., Hong, P. C., Hou, G. Y., Hou, X. T., Hou, Y. R., Hou, Z. L., Hu, B. Y., Hu, H. M., Hu, J. F., Hu, S. L., Hu, T., Hu, Y., Huang, G. S., Huang, K. X., Huang, L. Q., Huang, X. T., Huang, Y. P., Hussain, T., Hölzken, F., Hüsken, N, der Wiesche, N. in, Jackson, J., Janchiv, S., Jeong, J. H., Ji, Q., Ji, Q. P., Ji, W., Ji, X. B., Ji, X. L., Ji, Y. Y., Jia, X. Q., Jia, Z. K., Jiang, D., Jiang, H. B., Jiang, P. C., Jiang, S. S., Jiang, T. J., Jiang, X. S., Jiang, Y., Jiao, J. B., Jiao, J. K., Jiao, Z., Jin, S., Jin, Y., Jing, M. Q., Jing, X. M., Johansson, T., Kabana, S., Kalantar-Nayestanaki, N., Kang, X. L., Kang, X. S., Kavatsyuk, M., Ke, B. C., Khachatryan, V., Khoukaz, A., Kiuchi, R., Kolcu, O. B., Kopf, B., Kuessner, M., Kui, X., Kumar, N., Kupsc, A., Kühn, W., Lane, J. J., Larin, P., Lavezzi, L., Lei, T. T., Lei, Z. H., Lellmann, M., Lenz, T., Li, C., Li, C. H., Li, Cheng, Li, D. M., Li, F., Li, G., Li, H. B., Li, H. J., Li, H. N., Li, Hui, Li, J. R., Li, J. S., Li, Ke, Li, L. J, Li, L. K., Li, Lei, Li, M. H., Li, P. R., Li, Q. M., Li, Q. X., Li, R., Li, S. X., Li, T., Li, W. D., Li, W. G., Li, X., Li, X. H., Li, X. L., Li, X. Z., Li, Xiaoyu, Li, Y. G., Li, Z. J., Li, Z. X., Li, Z. Y., Liang, C., Liang, H., Liang, Y. F., Liang, Y. T., Liao, G. R., Liao, L. Z., Libby, J., Limphirat, A., Lin, C. C., Lin, D. X., Lin, T., Liu, B. J., Liu, B. X., Liu, C., Liu, C. X., Liu, F. H., Liu, Fang, Liu, Feng, Liu, G. M., Liu, H., Liu, H. B., Liu, H. M., Liu, Huanhuan, Liu, Huihui, Liu, J. B., Liu, J. Y., Liu, K., Liu, K. Y., Liu, Ke, Liu, L., Liu, L. C., Liu, Lu, Liu, M. H., Liu, P. L., Liu, Q., Liu, S. B., Liu, T., Liu, W. K., Liu, W. M., Liu, X., Liu, Y., Liu, Y. B., Liu, Z. A., Liu, Z. D., Liu, Z. Q., Lou, X. C., Lu, F. X., Lu, H. J., Lu, J. G., Lu, X. L., Lu, Y., Lu, Y. P., Lu, Z. H., Luo, C. L., Luo, M. 
X., Luo, T., Luo, X. L., Lyu, X. R., Lyu, Y. F., Ma, F. C., Ma, H., Ma, H. L., Ma, J. L., Ma, L. L., Ma, M. M., Ma, Q. M., Ma, R. Q., Ma, T., Ma, X. T., Ma, X. Y., Ma, Y., Ma, Y. M., Maas, F. E., Maggiora, M., Malde, S., Mao, Y. J., Mao, Z. P., Marcello, S., Meng, Z. X., Messchendorp, J. G., Mezzadri, G., Miao, H., Min, T. J., Mitchell, R. E., Mo, X. H., Moses, B., Muchnoi, N. Yu., Muskalla, J., Nefedov, Y., Nerling, F., Nie, L. S., Nikolaev, I. B., Ning, Z., Nisar, S., Niu, Q. L., Niu, W. D., Niu, Y., Olsen, S. L., Ouyang, Q., Pacetti, S., Pan, X., Pan, Y., Pathak, A., Patteri, P., Pei, Y. P., Pelizaeus, M., Peng, H. P., Peng, Y. Y., Peters, K., Ping, J. L., Ping, R. G., Plura, S., Prasad, V., Qi, F. Z., Qi, H., Qi, H. R., Qi, M., Qi, T. Y., Qian, S., Qian, W. B., Qiao, C. F., Qiao, X. K., Qin, J. J., Qin, L. Q., Qin, L. Y., Qin, X. S., Qin, Z. H., Qiu, J. F., Qu, Z. H., Redmer, C. F., Ren, K. J., Rivetti, A., Rolo, M., Rong, G., Rosner, Ch., Ruan, S. N., Salone, N., Sarantsev, A., Schelhaas, Y., Schoenning, K., Scodeggio, M., Shan, K. Y., Shan, W., Shan, X. Y., Shang, Z. J, Shangguan, J. F., Shao, L. G., Shao, M., Shen, C. P., Shen, H. F., Shen, W. H., Shen, X. Y., Shi, B. A., Shi, H., Shi, H. C., Shi, J. L., Shi, J. Y., Shi, Q. Q., Shi, S. Y., Shi, X., Song, J. J., Song, T. Z., Song, W. M., Song, Y. J., Song, Y. X., Sosio, S., Spataro, S., Stieler, F., Su, Y. J., Sun, G. B., Sun, G. X., Sun, H., Sun, H. K., Sun, J. F., Sun, K., Sun, L., Sun, S. S., Sun, T., Sun, W. Y., Sun, Y., Sun, Y. J., Sun, Y. Z., Sun, Z. Q., Sun, Z. T., Tang, C. J., Tang, G. Y., Tang, J., Tang, M., Tang, Y. A., Tao, L. Y., Tao, Q. T., Tat, M., Teng, J. X., Thoren, V., Tian, W. H., Tian, Y., Tian, Z. F., Uman, I., Wan, Y., Wang, S. J., Wang, B., Wang, B. L., Wang, Bo, Wang, D. Y., Wang, F., Wang, H. J., Wang, J. J., Wang, J. P., Wang, K., Wang, L. L., Wang, M., Wang, Meng, Wang, N. Y., Wang, S., Wang, T., Wang, T. J., Wang, W., Wang, W. P., Wang, X., Wang, X. F., Wang, X. J., Wang, X. L., Wang, X. N., Wang, Y., Wang, Y. D., Wang, Y. F., Wang, Y. L., Wang, Y. N., Wang, Y. Q., Wang, Yaqian, Wang, Yi, Wang, Z., Wang, Z. L., Wang, Z. Y., Wang, Ziyi, Wei, D. H., Weidner, F., Wen, S. P., Wen, Y. R., Wiedner, U., Wilkinson, G., Wolke, M., Wollenberg, L., Wu, C., Wu, J. F., Wu, L. H., Wu, L. J., Wu, X., Wu, X. H., Wu, Y., Wu, Y. H., Wu, Y. J., Wu, Z., Xia, L., Xian, X. M., Xiang, B. H., Xiang, T., Xiao, D., Xiao, G. Y., Xiao, S. Y., Xiao, Y. L., Xiao, Z. J., Xie, C., Xie, X. H., Xie, Y., Xie, Y. G., Xie, Y. H., Xie, Z. P., Xing, T. Y., Xu, C. F., Xu, C. J., Xu, G. F., Xu, H. Y., Xu, M., Xu, Q. J., Xu, Q. N., Xu, W., Xu, W. L., Xu, X. P., Xu, Y. C., Xu, Z. P., Xu, Z. S., Yan, F., Yan, L., Yan, W. B., Yan, W. C., Yan, X. Q., Yang, H. J., Yang, H. L., Yang, H. X., Yang, Tao, Yang, Y., Yang, Y. F., Yang, Y. X., Yang, Yifan, Yang, Z. W., Yao, Z. P., Ye, M., Ye, M. H., Yin, J. H., You, Z. Y., Yu, B. X., Yu, C. X., Yu, G., Yu, J. S., Yu, T., Yu, X. D., Yu, Y. C., Yuan, C. Z., Yuan, J., Yuan, L., Yuan, S. C., Yuan, Y., Yuan, Y. J., Yuan, Z. Y., Yue, C. X., Zafar, A. A., Zeng, F. R., Zeng, S. H., Zeng, X., Zeng, Y., Zeng, Y. J., Zhai, X. Y., Zhai, Y. C., Zhan, Y. H., Zhang, A. Q., Zhang, B. L., Zhang, B. X., Zhang, D. H., Zhang, G. Y., Zhang, H., Zhang, H. C., Zhang, H. H., Zhang, H. Q., Zhang, H. R., Zhang, H. Y., Zhang, J., Zhang, J. J., Zhang, J. L., Zhang, J. Q., Zhang, J. S., Zhang, J. W., Zhang, J. X., Zhang, J. Y., Zhang, J. Z., Zhang, Jianyu, Zhang, L. M., Zhang, Lei, Zhang, P., Zhang, Q. Y., Zhang, R. 
Y, Zhang, Shuihan, Zhang, Shulei, Zhang, X. D., Zhang, X. M., Zhang, X. Y., Zhang, Y., Zhang, Y. T., Zhang, Y. H., Zhang, Y. M., Zhang, Yan, Zhang, Yao, Zhang, Z. D., Zhang, Z. H., Zhang, Z. L., Zhang, Z. Y., Zhang, Z. Z., Zhao, G., Zhao, J. Y., Zhao, J. Z., Zhao, Lei, Zhao, Ling, Zhao, M. G., Zhao, N., Zhao, R. P., Zhao, S. J., Zhao, Y. B., Zhao, Y. X., Zhao, Z. G., Zhemchugov, A., Zheng, B., Zheng, B. M., Zheng, J. P., Zheng, W. J., Zheng, Y. H., Zhong, B., Zhong, X., Zhou, H., Zhou, J. Y., Zhou, L. P., Zhou, S., Zhou, X., Zhou, X. K., Zhou, X. R., Zhou, X. Y., Zhou, Y. Z., Zhu, J., Zhu, K., Zhu, K. J., Zhu, K. S., Zhu, L., Zhu, L. X., Zhu, S. H., Zhu, S. Q., Zhu, T. J., Zhu, W. D., Zhu, Y. C., Zhu, Z. A., Zou, J. H., and Zu, J.
- Subjects
High Energy Physics - Experiment ,High Energy Physics - Phenomenology - Abstract
Six $C$-even states, denoted as $X$, with quantum numbers $J^{PC}=0^{-+}$, $1^{\pm+}$, or $2^{\pm+}$, are searched for via the $e^+e^-\to\gamma D_{s}^{\pm}D_{s}^{*\mp}$ process using $(1667.39\pm8.84)~\mathrm{pb}^{-1}$ of $e^+e^-$ collision data collected with the BESIII detector operating at the BEPCII storage ring at a center-of-mass energy of $\sqrt{s}=(4681.92\pm0.30)~\mathrm{MeV}$. No statistically significant signal is observed in the mass range from $4.08$ to $4.32~\mathrm{GeV}/c^{2}$. Upper limits on $\sigma[e^+e^- \to \gamma X] \cdot \mathcal{B}[X \to D_{s}^{\pm} D_{s}^{*\mp}]$ at the $90\%$ confidence level are determined. (A schematic counting-experiment limit is sketched after this record.)
- Published
- 2024
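The $90\%$ confidence-level upper limits quoted above are, in the simplest counting-experiment picture, the largest signal yield still compatible with the observed event count; dividing by luminosity, efficiency, and intermediate branching fractions converts this to a cross-section limit. A minimal classical-limit sketch (the collaboration's actual statistical treatment is likely likelihood-based and more sophisticated):

import math

def poisson_cdf(n, mu):
    # P(N <= n) for a Poisson distribution with mean mu.
    return sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n + 1))

def upper_limit(n_obs, bkg=0.0, cl=0.90):
    # Classical upper limit on a signal yield s with known background:
    # the s for which P(N <= n_obs | s + bkg) = 1 - cl, found by bisection.
    lo, hi = 0.0, 50.0 + 10.0 * n_obs
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, mid + bkg) > 1.0 - cl:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example: upper_limit(0) is about 2.30 events at 90% CL;
# sigma_UL = s_UL / (luminosity * efficiency * branching fractions).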