219 results for "knowledge distillation"
Search Results
2. WeldNet: a lightweight deep learning model for welding defect recognition
3. Two-stage model fusion scheme based on knowledge distillation for stragglers in federated learning
4. A Memory-Assisted Knowledge Transferring Framework with Curriculum Anticipation for Weakly Supervised Online Activity Detection
5. A simulated two-stream network via multilevel distillation of reviewed features and decoupled logits for video action recognition
6. Multi-modal CrossViT using 3D spatial information for visual localization
7. Spatial-frequency attention-based optical and scene flow with cross-modal knowledge distillation
8. An enterprise portrait tag extraction method based on context embedding and knowledge distillation
9. KD3mt: knowledge distillation-driven dynamic mixer transformer for medical image fusion
10. KDTL: knowledge-distilled transfer learning framework for diagnosing mental disorders using EEG spectrograms
11. Unsupervised anomaly detection and localization via bidirectional knowledge distillation
12. Variational AdaBoost knowledge distillation for skin lesion classification in dermatology images
13. EnParaNet: a novel deep learning architecture for faster prediction using low-computational resource devices
14. Lightweight image dehazing networks based on soft knowledge distillation
15. Masked face recognition based on knowledge distillation and convolutional self-attention network
16. Improving Image Anomaly Localization: A Multi-branch and Skip Connection Framework
17. DE-DFKD: diversity enhancing data-free knowledge distillation
18. Design of a knowledge distillation network for wifi-based indoor localization
19. Targeted training for numerical reasoning with large language models
20. Iterative filter pruning with combined feature maps and knowledge distillation
21. Unsupervised dual-teacher knowledge distillation for pseudo-label refinement in domain adaptive person re-identification
22. An ensemble of self-supervised teachers for minimal student model with auto-tuned hyperparameters via improved Bayesian optimization
23. LSF-IDM: Deep learning-based lightweight semantic fusion intrusion detection model for automotive
24. Neural models for semantic analysis of handwritten document images
25. Efficient skin lesion segmentation with boundary distillation
26. Descriptor Distillation: A Teacher-Student-Regularized Framework for Learning Local Descriptors
27. Sketch Classification and Sketch Based Image Retrieval Using ViT with Self-Distillation for Few Samples
28. Swarm mutual learning
29. GFD-SSL: generative federated knowledge distillation-based semi-supervised learning
30. PDD: Pruning Neural Networks During Knowledge Distillation
31. Efficiency-oriented approaches for self-supervised speech representation learning
32. Knowledge distillation-based approach for object detection in thermal images during adverse weather conditions
33. MCAD: Multi-classification anomaly detection with relational knowledge distillation
34. Dual knowledge distillation for visual tracking with teacher–student network
35. A lightweight road crack detection algorithm based on improved YOLOv7 model
36. Teacher-student guided knowledge distillation for unsupervised convolutional neural network-based speckle tracking in ultrasound strain elastography
37. Dual-student knowledge distillation for visual anomaly detection
38. Cross-Architecture Knowledge Distillation
39. Knowledge Distillation Meets Open-Set Semi-supervised Learning
40. Incremental Distillation Physics-Informed Neural Network (IDPINN) Accurately Models the Evolution of Optical Solitons
41. Knowledge Distillation via Hierarchical Matching for Small Object Detection
42. Multi-resolution Twinned Residual Auto-Encoders (MR-TRAE)—A Novel DL Model for Image Multi-resolution
43. Self-supervised extracted contrast network for facial expression recognition
44. A lightweight deep learning model with knowledge distillation for pulmonary diseases detection in chest X-rays
45. Lightweight 3D-StudentNet for defending against face replay attacks
46. Global Instance Relation Distillation for convolutional neural network compression
47. Knowledge distillation based on projector integration and classifier sharing
48. Designing an improved deep learning-based model for COVID-19 recognition in chest X-ray images: a knowledge distillation approach
49. T-KD: two-tier knowledge distillation for a lightweight underwater fish species classification model
50. Federated learning for feature-fusion based requirement classification