
Exploring Long Short-Term Memory-based Encoder-Decoder Framework for Extractive Text Summarization

Abstract


Bilal Somar

Incoming article date: 22.04.2023

In this article, we present a study of Natural Language Processing (NLP) and Machine Learning (ML) techniques, focusing on deep learning algorithms. The research explores the application of Long Short-Term Memory (LSTM) encoder-decoder models with attention mechanisms to text summarization. The experimental dataset consists of news articles and their corresponding summaries. We describe the preprocessing steps applied to the data, including text cleaning and tokenization, and investigate the impact of different hyperparameters on model performance. The results demonstrate the effectiveness of the proposed approach in producing concise summaries of lengthy texts. The findings contribute to the advancement of NLP and ML techniques for text summarization.
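The text cleaning and tokenization steps mentioned above might be sketched roughly as follows; the function names and the specific cleaning rules (lowercasing, stripping non-letter characters) are illustrative assumptions, not the authors' actual pipeline:

```python
import re
from collections import Counter

def clean_text(text):
    """Lowercase, replace non-letter characters with spaces, collapse whitespace."""
    text = text.lower()
    text = re.sub(r"[^a-z\s]", " ", text)  # assumption: drop punctuation and digits
    return re.sub(r"\s+", " ", text).strip()

def tokenize(text):
    """Split cleaned text into word tokens."""
    return clean_text(text).split()

def build_vocab(texts, max_words=None):
    """Map each word to an integer id, most frequent first (0 reserved for padding)."""
    counts = Counter(tok for t in texts for tok in tokenize(t))
    words = [w for w, _ in counts.most_common(max_words)]
    return {w: i + 1 for i, w in enumerate(words)}

def texts_to_sequences(texts, vocab):
    """Encode texts as integer id sequences, dropping out-of-vocabulary words."""
    return [[vocab[tok] for tok in tokenize(t) if tok in vocab] for t in texts]
```

The resulting integer sequences would then be padded to a fixed length and fed to the LSTM encoder; reserving id 0 for padding is a common convention in sequence models.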

Keywords: extractive text summarization, sequence-to-sequence, long short-term memory, encoder-decoder, summarization model, natural language processing, machine learning, deep learning, attention mechanism