I'm using the latest release of pupynere 1.0.15 and python 2.7.5.
I've run into something truly strange: your software crashes while reading a netCDF file.
Here is the traceback I get:
Traceback (most recent call last):
  File "parGlobalMoments.py", line 143, in <module>
    myFileHandles.append(pync.netcdf_file(inFiles[m], 'r'))
  File "/usr/lib64/python2.7/site-packages/pupynere.py", line 167, in __init__
    self._read()
  File "/usr/lib64/python2.7/site-packages/pupynere.py", line 397, in _read
    self._read_gatt_array()
  File "/usr/lib64/python2.7/site-packages/pupynere.py", line 415, in _read_gatt_array
    for k, v in self._read_att_array().items():
  File "/usr/lib64/python2.7/site-packages/pupynere.py", line 426, in _read_att_array
    attributes[name] = self._read_values()
  File "/usr/lib64/python2.7/site-packages/pupynere.py", line 538, in _read_values
    values = values.rstrip('\x00').decode('utf-8')
  File "/usr/lib64/python2.7/encodings/utf_8.py", line 16, in decode
    return codecs.utf_8_decode(input, errors, True)
UnicodeDecodeError: 'utf8' codec can't decode byte 0xd0 in position 0: invalid continuation byte
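For what it's worth, the failing line in _read_values seems to assume every string attribute is UTF-8, but a lone 0xd0 byte is not valid UTF-8 (it's a continuation-expecting lead byte). A minimal sketch of the failure and a possible fallback decode; the latin-1 choice is purely my guess at the files' actual encoding, not something I've confirmed:

```python
# Hypothetical attribute bytes as read from the file: a non-UTF-8
# byte followed by null padding, mimicking what _read_values sees.
raw = b'\xd0\x00\x00'

# pupynere first strips the trailing null padding...
stripped = raw.rstrip(b'\x00')

# ...then decodes strictly as UTF-8, which raises UnicodeDecodeError
# on 0xd0. A fallback to latin-1 (an assumption about the encoding;
# latin-1 maps every byte to a character, so it never raises) would
# at least let the file open.
try:
    text = stripped.decode('utf-8')
except UnicodeDecodeError:
    text = stripped.decode('latin-1')

print(repr(text))
```

If the attributes in those years' files were written in some one-byte legacy encoding, a fallback like this (or decode(..., errors='replace')) inside _read_values might be enough to make them readable again.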
What's really strange is that this happens with a widely used community dataset: the NCEP-1 reanalyses, which span 1948-present. Reading these data works just fine for the years 1948-2006 and 2010-2013, but the years 2007, 2008, and 2009 produce this fault for multiple fields in their daily-averaged surface reanalyses.
I can of course send you code that isolates the problem more precisely if you wish. I'm posting this message as a heads-up that I've found this issue, in the hope that you can offer a quick fix based on the error message, or else advise me on how I may be using your software incorrectly, so that I can alter my usage to avoid the error.
Attached is the code I am using. You can get some of the files that trigger the bug at this URL:
The data filenames follow the naming convention fieldname.gauss.year.nc; each file contains one year's worth of globally gridded data, sampled daily. I've looked at the files tmin.2m.year.nc and tmax.2m.year.nc and observed the behavior reported above.
Thanks in advance for any help or advice you can offer.
All the Best, Jay