Jason R. Coombs  committed 070dc6e

Fixes #10639: no longer converts newlines; the newline style detected in the original file is now used, and an error is reported if mixed newlines are encountered.

  • Parent commits d11a1a3
  • Branches 3.2


Files changed (2)

   "make install" creates symlinks in --prefix bin for the "-32" files in the
   framework bin directory like the installer does.
+- Issue #10639: no longer converts newlines and will raise
+  an error if attempting to convert a file with mixed newlines.

File Tools/scripts/

 The backup file is a copy of the one that is being reindented. The ".bak"
 file is generated with shutil.copy(), but some corner cases regarding
-user/group and permissions could leave the backup file more readable that
+user/group and permissions could leave the backup file more readable than
 you'd prefer. You can always use the --nobackup option to prevent this.
     if verbose:
         print("checking", file, "...", end=' ')
-    with  open(file, 'rb') as f:
+    with open(file, 'rb') as f:
         encoding, _ = tokenize.detect_encoding(f.readline)
         with open(file, encoding=encoding) as f:
         errprint("%s: I/O Error: %s" % (file, str(msg)))
+    newline = r.newlines
+    if isinstance(newline, tuple):
+        errprint("%s: mixed newlines detected; cannot process file" % file)
+        return
         if verbose:
                 shutil.copyfile(file, bak)
                 if verbose:
                     print("backed up", file, "to", bak)
-            with open(file, "w", encoding=encoding) as f:
+            with open(file, "w", encoding=encoding, newline=newline) as f:
             if verbose:
                 print("wrote new", file)
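The mixed-newline check above relies on a property of Python's universal-newline text I/O: after reading, the file object's `.newlines` attribute is a single string when one newline style was seen, and a tuple when several were. A minimal standalone sketch (not part of the patch; the file contents are made up for illustration):

```python
# Demonstration: Python's text I/O records every newline style it has
# seen in the file object's .newlines attribute -- a str for a single
# style, a tuple when the file mixes styles.
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.write(b"one\r\ntwo\nthree\n")   # deliberately mixed CRLF and LF

with open(path, encoding="ascii") as f:
    f.read()                          # .newlines is populated during reading
    detected = f.newlines             # a tuple here, since styles are mixed

os.remove(path)
```

This is why the patch can treat `isinstance(newline, tuple)` as "mixed newlines found" and bail out before rewriting the file.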
         # indeed, they're our headache!
         self.stats = []
+        # Save the newlines found in the file so they can be used to
+        #  create output without mutating the newlines.
+        self.newlines = f.newlines
     def run(self):
         tokens = tokenize.generate_tokens(self.getline)
         for _token in tokens:
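The write side of the patch (`open(file, "w", encoding=encoding, newline=newline)`) uses the other half of the same mechanism: passing a `newline=` string to `open()` in write mode makes Python emit that exact sequence for every `"\n"` instead of translating to `os.linesep`. A hedged sketch with invented contents:

```python
# Sketch: newline= in write mode controls how "\n" is translated on
# output, which is how the newline style saved from the input file can
# be reproduced when writing the reindented result.
import os
import tempfile

fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "w", newline="\r\n") as f:
    f.write("alpha\nbeta\n")          # each "\n" is written as "\r\n"

with open(path, "rb") as f:           # read raw bytes to inspect the result
    raw = f.read()

os.remove(path)
```

Because only `""`, `"\n"`, `"\r"`, and `"\r\n"` are legal `newline=` values for writing, rejecting tuple values (mixed files) before reaching the write keeps the `open()` call valid.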