Your search for '"knowledge distillation"' returned 153 results.

Search Constraints

You searched for: Descriptor: "knowledge distillation"; Publisher: Springer International Publishing
Search Results

13. FPD: Feature Pyramid Knowledge Distillation

14. Class-Incremental Learning with Multiscale Distillation for Weakly Supervised Temporal Action Localization

15. FairDistillation: Mitigating Stereotyping in Language Models

19. Effective SNOMED-CT Concept Classification from Natural Language using Knowledge Distillation

20. Design and Develop Hardware Aware DNN for Faster Inference

28. Cross-Language Plagiarism Detection: A Case Study of European Languages Academic Works

29. Debias the Black-Box: A Fair Ranking Framework via Knowledge Distillation

30. Distill-AER: Fine-Grained Address Entity Recognition from Spoken Dialogue via Knowledge Distillation

31. Collaborative Multiple-Student Single-Teacher for Online Learning

32. Text CAPTCHA Traversal via Knowledge Distillation of Convolutional Neural Networks: Exploring the Impact of Color Channels Selection

33. COVID-19 Classification from Chest X-rays Based on Attention and Knowledge Distillation

34. Attention-Fused CNN Model Compression with Knowledge Distillation for Brain Tumor Segmentation

35. Deep-to-Bottom Weights Decay: A Systemic Knowledge Review Learning Technique for Transformer Layers in Knowledge Distillation

36. Realtime Optical Flow Estimation on Vein and Artery Ultrasound Sequences Based on Knowledge-Distillation

37. A Methodology for Enabling NLP Capabilities on Edge and Low-Resource Devices

38. Elastic Deep Learning Using Knowledge Distillation with Heterogeneous Computing Resources

40. Overcoming Forgetting in Local Adaptation of Federated Learning Model

42. Rapid and High-Purity Seed Grading Based on Pruned Deep Convolutional Neural Network

43. Using Explainable Boosting Machine to Compare Idiographic and Nomothetic Approaches for Ecological Momentary Assessment Data

44. Robust Multi-model Personalized Federated Learning via Model Distillation

45. Building Portable ECG Classification Model with Cross-Dimension Knowledge Distillation

46. LightBERT: A Distilled Chinese BERT Model

47. A Privacy Knowledge Transfer Method for Clinical Concept Extraction

48. WiFi-Based Multi-task Sensing
