Commits

Hong Minhee committed 2395940

Using `str` instead of `unicode` for token tags.

  • Parent commits 3468fbd

Files changed (1)

File kong/lexer.py

 
        >>> t = lambda s: list(tokenize(s))
        >>> t(u'a<-func  (123)')  # doctest: +NORMALIZE_WHITESPACE
-       [kong.lexer.Token(u'identifier', u'a', 0),
-        kong.lexer.Token(u'arrow', u'<-', 1),
-        kong.lexer.Token(u'identifier', u'func', 3),
-        kong.lexer.Token(u'space', u'  ', 7),
-        kong.lexer.Token(u'parenthesis', u'(', 9),
-        kong.lexer.Token(u'number', u'123', 10),
-        kong.lexer.Token(u'parenthesis', u')', 13)]
+       [kong.lexer.Token('identifier', u'a', 0),
+        kong.lexer.Token('arrow', u'<-', 1),
+        kong.lexer.Token('identifier', u'func', 3),
+        kong.lexer.Token('space', u'  ', 7),
+        kong.lexer.Token('parenthesis', u'(', 9),
+        kong.lexer.Token('number', u'123', 10),
+        kong.lexer.Token('parenthesis', u')', 13)]
 
     It supports streaming as well:
 
 
        >>> stream = [u'a(12', u'3)\nb<', u'-c * 123']
        >>> t(stream)  # doctest: +NORMALIZE_WHITESPACE
-       [kong.lexer.Token(u'identifier', u'a', 0),
-        kong.lexer.Token(u'parenthesis', u'(', 1),
-        kong.lexer.Token(u'number', u'123', 2),
-        kong.lexer.Token(u'parenthesis', u')', 5),
-        kong.lexer.Token(u'newline', u'\n', 6),
-        kong.lexer.Token(u'identifier', u'b', 7),
-        kong.lexer.Token(u'arrow', u'<-', 8),
-        kong.lexer.Token(u'identifier', u'c', 10),
-        kong.lexer.Token(u'space', u' ', 11),
-        kong.lexer.Token(u'identifier', u'*', 12),
-        kong.lexer.Token(u'space', u' ', 13),
-        kong.lexer.Token(u'number', u'123', 14)]
+       [kong.lexer.Token('identifier', u'a', 0),
+        kong.lexer.Token('parenthesis', u'(', 1),
+        kong.lexer.Token('number', u'123', 2),
+        kong.lexer.Token('parenthesis', u')', 5),
+        kong.lexer.Token('newline', u'\n', 6),
+        kong.lexer.Token('identifier', u'b', 7),
+        kong.lexer.Token('arrow', u'<-', 8),
+        kong.lexer.Token('identifier', u'c', 10),
+        kong.lexer.Token('space', u' ', 11),
+        kong.lexer.Token('identifier', u'*', 12),
+        kong.lexer.Token('space', u' ', 13),
+        kong.lexer.Token('number', u'123', 14)]
 
     :param stream: input stream
     :type stream: :class:`collections.Iterable`
         d = m.groupdict()
         for tag, string in d.iteritems():
             if string:
-                return Token(tag, string, i)
+                return Token(str(tag), string, i)
     i = 0
     s = ''
     buf = StringIO.StringIO()
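The `groupdict()` dispatch in the hunk above can be sketched as follows. This is a minimal, Python 3 sketch with a simplified, hypothetical token pattern (the real pattern in `kong/lexer.py` covers more token types); the `str(tag)` call mirrors the normalization introduced by this commit:

```python
import re

# Hypothetical, simplified pattern; the real kong/lexer.py pattern
# defines more token types. '*' is treated as an identifier character,
# matching the doctest above where u'*' lexes as an identifier.
TOKEN_PATTERN = re.compile(
    r'(?P<arrow><-)|'
    r'(?P<identifier>[A-Za-z_*][A-Za-z0-9_]*)|'
    r'(?P<number>\d+)|'
    r'(?P<parenthesis>[()])|'
    r'(?P<newline>\n)|'
    r'(?P<space>[ \t]+)'
)

def first_token(s, i=0):
    """Return (tag, string, offset) for the first token at position i."""
    m = TOKEN_PATTERN.match(s, i)
    if m is None:
        return None
    # Exactly one named group matches per alternative; the others are None.
    for tag, string in m.groupdict().items():
        if string is not None:
            # Group names are normalized with str(tag), as in the commit,
            # so tag reprs stay consistent regardless of pattern type.
            return (str(tag), string, m.start())
    return None
```

A quick call such as `first_token('a<-func')` yields `('identifier', 'a', 0)`, matching the first entry of the doctest output above.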