\item For the ranking of arguments, we measured the semantic similarity
between premise and conclusion
\item Here each word of the argument is embedded in a vector space, and then the
average of these word vectors is calculated
\item The similarity of a premise and a conclusion is then calculated from the
angle between their averaged vectors (see the sketch after this list)
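A brief formal sketch of this ranking score (notation ours, not from the original slides): if a premise consists of words $w_1, \dots, w_n$ with embeddings $\mathbf{v}_{w_1}, \dots, \mathbf{v}_{w_n}$, its representation is the mean $\mathbf{p} = \frac{1}{n} \sum_{i=1}^{n} \mathbf{v}_{w_i}$, and analogously $\mathbf{c}$ for the conclusion. Assuming the angle-based similarity is the usual cosine similarity, the score is
\[
  \operatorname{sim}(\mathbf{p}, \mathbf{c}) \;=\; \cos\theta \;=\; \frac{\mathbf{p} \cdot \mathbf{c}}{\lVert \mathbf{p} \rVert \, \lVert \mathbf{c} \rVert},
\]
so premises whose averaged embedding points in nearly the same direction as the conclusion's are ranked higher.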
\end{itemize}
\begin{itemize}
\item Another approach to rank the arguments is to measure how positive the tone
of the premises is
\item For this, we used a sentiment neural network based on FastText\footnote{A. Joulin, E. Grave, P. Bojanowski, and T. Mikolov, “Bag of tricks for efficient text classification,”}, which was