Issue #7: command options and redirection confusion (status: resolved)
This was unexpected based on typical bash shell behavior:
>>> p = run('echo foo | gzip -9 > debug.out')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "sarge/__init__.py", line 1265, in run
    with Pipeline(cmd, **kwargs) as p:
  File "sarge/__init__.py", line 873, in __init__
    t = command_line_parser.parse(source, posix=posix)
  File "sarge/__init__.py", line 730, in parse
    result = self.parse_list()
  File "sarge/__init__.py", line 734, in parse_list
    parts = [self.parse_pipeline()]
  File "sarge/__init__.py", line 750, in parse_pipeline
    parts = [self.parse_logical()]
  File "sarge/__init__.py", line 783, in parse_logical
    part = self.parse_command()
  File "sarge/__init__.py", line 800, in parse_command
    node = self.parse_command_part()
  File "sarge/__init__.py", line 849, in parse_command_part
    'stderr, not %s' % list(d.keys()))
ValueError: semantics: can only redirect stdout and stderr, not [-9]
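The error message suggests the parser classified `-9` as a redirection file descriptor. A minimal sketch of that kind of misclassification (hypothetical; this is not sarge's actual code, and `looks_like_fd_redirect` is an invented helper):

```python
import shlex

def looks_like_fd_redirect(tokens, i):
    # A naive rule: any integer-looking token that sits directly before
    # '>' is a file descriptor, as in bash's 2> -- but stripping the
    # sign means this also matches the gzip option '-9'.
    tok = tokens[i]
    return tok.lstrip('-').isdigit() and i + 1 < len(tokens) and tokens[i + 1] == '>'

tokens = shlex.split('gzip -9 > debug.out', posix=False)
print(tokens)                             # ['gzip', '-9', '>', 'debug.out']
print(looks_like_fd_redirect(tokens, 1))  # True: '-9' is misread as an fd
```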
Comments (3)

repo owner: Thanks for the report. I also implemented an interim fix, but I think the answer is to change the tokenizer. I have been using a variant of the stdlib's tokenizer, but I fear it is too inflexible. I'm working on an approach which, if it doesn't work, may mean that a new tokenizer is needed :-(

repo owner changed status to resolved:
Closes #7: Fixed bugs in handling whitespace and redirections. → <<cset c515eb5cccc9>>

Account Deleted:
Playing around with the tokenizer, I changed next_token() to treat only non-negative numbers as tokens of type 'number':
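A sketch of the shape of that change (hypothetical; sarge's real next_token() is more involved than this):

```python
# Hypothetical token classifier: only a bare non-negative integer is a
# 'number' (a candidate file descriptor); '-9' stays an ordinary word.
def token_type(text):
    if text.isdigit():
        return 'number'
    return 'word'

print(token_type('2'))   # 'number' -- may still precede a redirection
print(token_type('-9'))  # 'word'   -- no longer mistaken for an fd
```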
And this fixes the above command. However, redirection is still broken for other use cases. For example:
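An illustrative command of this kind (a reconstruction, not necessarily the reporter's exact snippet) is `echo foo 2 > out.txt`, where `2` is meant as a literal argument. Once whitespace is discarded, the token stream no longer distinguishes it from a stderr redirection:

```python
import shlex

# After tokenization, nothing records that '2' was separated from '>'
# by a space, so the pair '2' '>' looks just like the redirection 2>:
tokens = shlex.split('echo foo 2 > out.txt', posix=False)
print(tokens)  # ['echo', 'foo', '2', '>', 'out.txt']
```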
Whereas with normal bash (4.1.2) I get expected results:
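A reconstruction of the expected bash behaviour (the file name is illustrative):

```shell
# bash treats the space-separated '2' as an ordinary argument, so the
# redirected file receives both words:
echo foo 2 > out.txt
cat out.txt    # prints: foo 2
```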
Perhaps whitespace could be taken into account here? Or quoting could be allowed, so that a quoted argument is never treated as a file descriptor.
I.e., to get the current sarge results from a bash shell, I would expect to need the syntax:
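That is, roughly (reconstructed syntax):

```shell
# With no space before '>', bash really does redirect stderr; stdout
# still carries only 'foo', and err.txt stays empty here:
echo foo 2> err.txt
```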
But quoting the second argument works for bash, whereas it does not for sarge:
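Reconstructed examples of the contrast (illustrative; the sarge call shown in the comment below is hypothetical):

```shell
# bash: quoting makes '2' an unambiguous ordinary argument:
echo foo "2" > out.txt
cat out.txt    # prints: foo 2

# sarge at the time still misparsed the quoted digit, e.g.
# (hypothetical Python session):
#   >>> from sarge import run
#   >>> run('echo foo "2" > out.txt')   # still treated as 2>
```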