This work focuses on the efficiency of the knowledge distillation approach in generating a lightweight yet powerful BERT-based model for natural language processing (NLP) applications. After creating the model, we applied the resulting model, LastBERT.
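Knowledge distillation trains a small student model to mimic a large teacher by combining a cross-entropy loss on the true labels with a divergence between temperature-softened teacher and student distributions. The sketch below illustrates that combined loss in plain Python; the temperature `T`, mixing weight `alpha`, and the specific loss form are standard choices from the distillation literature, not details taken from this work.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-softened softmax: higher T flattens the distribution.
    exps = [math.exp(x / T) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label, T=2.0, alpha=0.5):
    # Soft targets: teacher and student distributions at temperature T.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL divergence between softened distributions, scaled by T^2
    # so its gradient magnitude stays comparable to the hard loss.
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_teacher, p_student)) * T * T
    # Standard cross-entropy on the ground-truth label (temperature 1).
    hard = -math.log(softmax(student_logits)[true_label])
    # Weighted combination of the soft (distillation) and hard losses.
    return alpha * kl + (1 - alpha) * hard
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the weighted hard-label cross-entropy remains, which is why matching the teacher and fitting the labels can be traded off through `alpha`.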