A Zero-Energy Cognitive Distillation Framework for 6G-Edge Autonomous Ubiquitous Learning Ecosystems
Rufat Azizov, Associate Professor, Azerbaijan State Oil and Industry University, Baku, Azerbaijan. rufat.azizov@sdu.edu.az, ORCID: 0000-0001-8662-1550
Khamid Mannopov, Associate Professor, Tashkent State Technical University, Tashkent, Uzbekistan. x.mannopov@mail.ru, ORCID: 0000-0003-4648-7777
Oleg Kim, Lecturer, Jizzakh State Pedagogical University, Jizzakh, Uzbekistan. olkimleg@gmail.com, ORCID: 0009-0002-3311-2896
Sitora Daniyarova, Assistant Teacher, Tashkent University of Information Technologies named after Muhammad al-Khwarizmi, Tashkent, Uzbekistan. daniyarova@mail.ru, ORCID: 0009-0003-8206-7283
Feruza Murtazayeva, Associate Professor, Bukhara State University, Bukhara, Uzbekistan. feruza79.79@list.ru, ORCID: 0000-0002-6415-336X
Manzura Rustamova, Associate Professor, Kimyo International University in Tashkent, Tashkent, Uzbekistan. m.rustamova@kiut.uz, ORCID: 0009-0004-9377-5936
I.B. Sapaev, Tashkent Institute of Irrigation and Agricultural Mechanization Engineers, National Research University, Tashkent, Uzbekistan; Scientific Researcher, University of Tashkent for Applied Science, Tashkent, Uzbekistan. sapaevibrokhim@gmail.com, ORCID: 0000-0003-2365-1554
Abstract: Model training and knowledge transfer require significant energy and raise sustainability concerns. Most existing frameworks rely on distributed computing and energy-intensive distillation, which limits scalability in resource-constrained environments. This paper presents the Zero-Energy Cognitive Distillation (ZECD) Framework for 6G-edge ecosystems, which provides energy-efficient knowledge transfer while maintaining strong learning performance. The framework introduces a cognitive distillation mechanism that dynamically filters knowledge, reducing the need for excessive computation. In addition, the proposed method combines lightweight edge-based student models and adaptive teacher selection with ambient energy harvesting and event-driven computation, a combination designed to lower power consumption. Evaluations showed that the framework achieved a 42.7% reduction in energy consumption, a 35.3% decrease in latency, and 28.9% faster model convergence compared with previously developed methods, while learning accuracy increased by 11.6%. Robustness also improved under dynamic network conditions, with a performance drop of less than 5% under node failures. These results show that the proposed model balances energy efficiency and learning performance in large-scale 6G autonomous learning systems. This research offers an energy-efficient, scalable, and sustainable model for future intelligent systems and proposes the first energy-aware cognitive learning framework for distributed edge environments.
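The abstract describes a cognitive distillation mechanism that dynamically filters knowledge so the student only processes what the teacher transfers confidently. The paper does not give the exact rule, so the following is a minimal, hypothetical sketch of one plausible instantiation: confidence-gated soft-target selection, where samples on which the teacher is uncertain are skipped entirely, saving student-side computation. The function names, threshold, and temperature are illustrative assumptions, not the authors' specification.

```python
import math

def softmax(logits, temperature=1.0):
    """Numerically stable softmax over one sample's logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def filtered_distillation_targets(teacher_logits, conf_threshold=0.8, temperature=2.0):
    """Keep soft targets only for samples where the teacher is confident.

    Uncertain samples are dropped before any student-side work happens,
    which is one way the 'dynamic knowledge filtering' idea could cut
    computation (illustrative sketch, not the ZECD algorithm itself).
    """
    kept = []
    for logits in teacher_logits:
        # Gate on the teacher's peak class probability at temperature 1.
        if max(softmax(logits)) >= conf_threshold:
            # Softened targets (higher temperature) are what the student trains on.
            kept.append(softmax(logits, temperature))
    return kept

# Example: one confident sample is kept, one ambiguous sample is filtered out.
targets = filtered_distillation_targets([[5.0, 0.0, 0.0], [1.0, 0.9, 0.8]])
```

In an event-driven edge deployment, such a gate would also let a node stay idle (and conserve harvested energy) whenever no confidently distilled knowledge arrives.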