- edited description
UTF-8 BOM JSON (Windows) cannot be imported (tested with pose files)
I found that some vendor pose files (only some files in the product) return an error when I import them with the plug-in:
JSON error while reading ascii file
"I….\Genesis 3 Female\Poses\i13\Foxy\01 Full Body\i13Foxy 01.duf"
Unexpected UTF-8 BOM (decode using utf-8-sig): line 1 column 1 (char 0)
For details see
'C:\Users\TAKE\Documents\daz_importer_errors.txt'
I understand that all I need to do is unzip the duf (if zipped), then re-save it with a text editor; that actually worked for me.
But I do not like searching for and re-saving files as UTF-8 (without BOM), since about half of this product seems to be saved as UTF-8 with BOM ^^;
The Python documentation says that when opening a file, changing the encoding from "utf_8" to "utf_8_sig" may work for both:
io.open(filename, "r", encoding="utf_8_sig")
I have not tested it yet, but I may try it with my scripts. If you can confirm it, I hope you can change the code in the DAZ Importer wherever it imports JSON.
(If you have already set it so, I do not know whether it is a problem on the bpy side or not.)
======
Yes, I could confirm it: once a file is saved with a BOM, it causes an error when I load the JSON via bpy.
But if I change the encoding to encoding="utf_8_sig" (it may be needed only when loading files with io.open()),
it can load both kinds of JSON (with BOM or pure UTF-8).
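A minimal sketch (not the plugin's code) showing why this works: "utf_8_sig" strips a leading UTF-8 BOM if one is present and behaves exactly like plain "utf_8" otherwise, so one encoding handles both kinds of file.

```python
import codecs
import json
import os
import tempfile

# Sketch only: write the same JSON twice, once with a UTF-8 BOM and once
# without, then show that encoding="utf_8_sig" loads both.
data = {"pose": "test"}

for add_bom in (True, False):
    raw = json.dumps(data).encode("utf-8")
    if add_bom:
        raw = codecs.BOM_UTF8 + raw   # prepend b'\xef\xbb\xbf'
    fd, path = tempfile.mkstemp(suffix=".duf")
    with os.fdopen(fd, "wb") as fp:
        fp.write(raw)
    # "utf_8_sig" consumes the BOM if present, and is a no-op otherwise
    with open(path, "r", encoding="utf_8_sig") as fp:
        assert json.load(fp) == data
    os.remove(path)

print("both variants load")
```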
Comments (10)
-
reporter I could fix it temporarily: in fileutils.py, change line 54 in def safeopen() to
fp = open(filepath, rw, encoding="utf_8_sig")
That has worked for me (it can import both).
I do not know whether I also need to change loadJson.py line 47, in def loadJson(filepath, mustOpen=False):
string = bytes.decode("utf_8_sig")
As I said, I do not want to save with a BOM, so I may not touch def saveJson().
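A sketch of the temporary fix described above. The real safeopen() in fileutils.py likely takes different arguments and error handling; only the encoding change is from this thread, the rest is my assumption.

```python
def safeopen(filepath, rw, mustOpen=False):
    """Open a file, tolerating an optional UTF-8 BOM on read.

    Sketch only: the actual safeopen() in fileutils.py may differ.
    """
    try:
        # "utf_8_sig" consumes a leading BOM when reading; when no BOM
        # is present it behaves like plain "utf_8", so both old
        # (BOM'd) and plain pose files open correctly.
        return open(filepath, rw, encoding="utf_8_sig")
    except FileNotFoundError:
        if mustOpen:
            raise
        return None
```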
-
I am not a Python expert, so I do not know whether Python is supposed to support Unicode outside of comments. I know for sure that the zip format does not support Unicode file names. So, for example, if a Japanese PA publishes an asset using Japanese characters, it will work fine only on a Japanese Windows installation and will not work in another country.
I myself had issues with some Japanese assets obtained outside the DAZ store that could not install. What I mean is that PAs need to avoid non-ASCII file names if they want to publish internationally.
-
reporter It is not about the file name. There is no Japanese in those files' data or names.
When you save UTF-8 with an old Windows editor, it adds a BOM to UTF-8 by default.
Whether UTF-8 works with or without a BOM changes with each application; sometimes I need to save a file with a BOM, sometimes not. But Python can handle both.
And we know we should not use Japanese in file names or data unless the application is offered for Japanese users. When you see Japanese files on the web, their authors are not thinking of publishing internationally.
-
repo owner This is the first time I have heard about utf-8-sig, but from what I can see in the python docs it sometimes helps and almost never hurts. Changed in latest commit.
-
reporter Thanks. It is a DAZ official product (ironman's old pose products), so it may happen for most users who bought his products (but only for old products, and only some files).
It may need to be checked by Mac and Linux users too, though actually it should not cause harm (I suppose so).
For now, I keep loadJson the same as before:
loadJson.py line 47, in def loadJson(filepath, mustOpen=False):
string = bytes.decode("utf_8")
(I do not know whether it needs to change to utf_8_sig or not.)
-
reporter - attached plevis_minbend25.duf
-
reporter No. I tested the update with your commits, but we need to change
fileutils.py line 54, in def safeopen(), to
fp = open(filepath, rw, encoding="utf_8_sig")
too.
(You did not edit it in the recent commit.)
Please test with the attached pose, which was saved as UTF-8 with BOM.
-
repo owner OK, now I think that I have tracked down all places which specify the encoding. The plugin now uses utf_8_sig when reading and plain utf_8 when writing. Your test pose loads fine.
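The read/write asymmetry matters: "utf_8_sig" on read strips a BOM if one exists, but on write it would add one, recreating the original problem for other tools. A hedged sketch of the pattern the owner describes (the helper names here are mine, not the plugin's actual loadJson/saveJson):

```python
import json

READ_ENCODING = "utf_8_sig"   # tolerates an optional BOM on input
WRITE_ENCODING = "utf_8"      # never emits a BOM on output

def load_json(filepath):
    # Hypothetical helper: reads a JSON/.duf file whether or not it
    # starts with a UTF-8 BOM.
    with open(filepath, "r", encoding=READ_ENCODING) as fp:
        return json.load(fp)

def save_json(filepath, data):
    # Writing with "utf_8_sig" would prepend a BOM, so plain UTF-8 is
    # the right choice for output.
    with open(filepath, "w", encoding=WRITE_ENCODING) as fp:
        json.dump(data, fp, indent=1)
```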
-
reporter - changed status to resolved
Thanks, it should work now.