Merge pull request #93 from artemisart/fix-test
fix tokenization_test
jacobdevlin-google committed Nov 9, 2018
2 parents 3c67c1d + ff7d05a commit 35deae7
Showing 1 changed file with 1 addition and 1 deletion: tokenization_test.py
@@ -30,7 +30,7 @@ def test_full_tokenizer(self):
         "[UNK]", "[CLS]", "[SEP]", "want", "##want", "##ed", "wa", "un", "runn",
         "##ing", ","
     ]
-    with tempfile.NamedTemporaryFile(delete=False) as vocab_writer:
+    with tempfile.NamedTemporaryFile(mode='w+', delete=False) as vocab_writer:
       vocab_writer.write("".join([x + "\n" for x in vocab_tokens]))

       vocab_file = vocab_writer.name
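A minimal sketch of why this one-line change matters: in Python 3, `tempfile.NamedTemporaryFile` defaults to binary mode (`'w+b'`), so writing a `str` raises `TypeError`; passing `mode='w+'` opens the file in text mode, which is what the test's string vocab needs. The `vocab_tokens` list below is shortened for illustration; it is not the full list from the test.

```python
import os
import tempfile

vocab_tokens = ["[UNK]", "[CLS]", "hello"]

# Before the fix: default mode is 'w+b' (binary), so writing a str fails.
try:
    with tempfile.NamedTemporaryFile(delete=False) as f:
        f.write("".join(tok + "\n" for tok in vocab_tokens))
except TypeError:
    print("binary-mode file rejects str")

# After the fix: mode='w+' opens the file in text mode, so str writes succeed.
with tempfile.NamedTemporaryFile(mode='w+', delete=False) as vocab_writer:
    vocab_writer.write("".join(tok + "\n" for tok in vocab_tokens))
    vocab_file = vocab_writer.name

# The test later reopens the file by name, which is why delete=False is used.
with open(vocab_file) as f:
    print(f.read().splitlines())

os.unlink(vocab_file)  # clean up the temporary file
```

`delete=False` is kept from the original test so the file survives the `with` block and can be reopened by path (`vocab_writer.name`) when building the tokenizer.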
