
Commit fe85ab8

Treat UTF-16 strings in binary VDF as little-endian
Integers in binary VDF are already treated as little-endian (least significant byte first) regardless of CPU architecture, but the 16-bit code units in UTF-16 strings did not get the same treatment. This led to a test failure on big-endian machines.

Resolves: ValvePython#33
Signed-off-by: Simon McVittie <[email protected]>
1 parent: d762926
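The commit message's point about integers can be illustrated with `struct`: an explicit `<` byte-order prefix pins the layout to little-endian on any CPU, while a bare format falls back to native order. (The exact format strings vdf uses internally are an assumption here; `'<i'` is the common idiom.)

```python
import struct

# 42 stored least significant byte first, as binary VDF does on disk.
raw = b"\x2a\x00\x00\x00"

# '<i' decodes the same value on little- and big-endian hosts alike.
assert struct.unpack("<i", raw)[0] == 42

# A big-endian read of the same bytes yields a different number,
# which is the class of bug the commit fixes for UTF-16 strings.
assert struct.unpack(">i", raw)[0] == 0x2A000000  # 704643072
```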

File tree: 1 file changed (+2 −2 lines)


vdf/__init__.py (+2 −2)
@@ -361,7 +361,7 @@ def read_string(fp, wide=False):
     result = buf[:end]
 
     if wide:
-        result = result.decode('utf-16')
+        result = result.decode('utf-16le')
     elif bytes is not str:
         result = result.decode('utf-8', 'replace')
     else:
@@ -469,7 +469,7 @@ def _binary_dump_gen(obj, level=0, alt_format=False):
             value = value.encode('utf-8') + BIN_NONE
             yield BIN_STRING
         except:
-            value = value.encode('utf-16') + BIN_NONE*2
+            value = value.encode('utf-16le') + BIN_NONE*2
             yield BIN_WIDESTRING
         yield key + BIN_NONE + value
     elif isinstance(value, float):
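The difference between the two codecs in the diff above can be sketched as follows: Python's `'utf-16le'` codec is BOM-free and byte-order-independent, matching the bare little-endian layout binary VDF stores, whereas the generic `'utf-16'` codec prepends a BOM on encode and falls back to the machine's native byte order when decoding BOM-less input.

```python
# "hi" as bare UTF-16LE code units, the layout binary VDF stores.
payload = b"h\x00i\x00"

# Explicit little-endian decoding works on any CPU architecture.
assert payload.decode("utf-16le") == "hi"
assert "hi".encode("utf-16le") == payload

# The generic codec adds a 2-byte BOM on encode (0xFFFE or 0xFEFF
# depending on host order), so it does not produce the bare layout.
assert "hi".encode("utf-16")[:2] in (b"\xff\xfe", b"\xfe\xff")
```

Decoding `payload` with `'utf-16'` instead would read the bytes in native order, producing garbage on big-endian hosts; that is exactly the test failure this commit resolves.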

0 commit comments
