Tagging: Explore different tokenizers for punctuation

Issue #9 resolved
Tomas Rojo Hernandez created an issue

Currently using the str.split() method; we should try a more advanced tokenizer such as NLTK's word_tokenize to solve the punctuation issue (see the sketch below). Related to #1
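A quick illustration of the difference, as a sketch only: it assumes NLTK is installed and its "punkt" tokenizer models have been downloaded (e.g. via nltk.download("punkt")); the example sentence is made up for demonstration.

```python
from nltk.tokenize import word_tokenize

sentence = "The cat sat on the mat, didn't it?"

# str.split() keeps punctuation glued to the neighbouring word.
print(sentence.split())
# ['The', 'cat', 'sat', 'on', 'the', 'mat,', "didn't", 'it?']

# word_tokenize separates punctuation (and contractions) into their own tokens,
# which is what we need for tagging.
print(word_tokenize(sentence))
# ['The', 'cat', 'sat', 'on', 'the', 'mat', ',', 'did', "n't", 'it', '?']
```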
