ITA: An Energy-Efficient Attention and Softmax Accelerator for Quantized Transformers
MLA
İslamoğlu, Gamze, et al. “ITA: An Energy-Efficient Attention and Softmax Accelerator for Quantized Transformers.” 2023. EBSCOhost, https://doi.org/10.1109/ISLPED58423.2023.10244348.
APA
İslamoğlu, G., Scherer, M., Paulin, G., Fischer, T., Jung, V. J. B., Garofalo, A., & Benini, L. (2023). ITA: An Energy-Efficient Attention and Softmax Accelerator for Quantized Transformers. https://doi.org/10.1109/ISLPED58423.2023.10244348
Chicago
İslamoğlu, Gamze, Moritz Scherer, Gianna Paulin, Tim Fischer, Victor J. B. Jung, Angelo Garofalo, and Luca Benini. 2023. “ITA: An Energy-Efficient Attention and Softmax Accelerator for Quantized Transformers.” doi:10.1109/ISLPED58423.2023.10244348.