I made a simple Python program to generate a big text file:
import sys
import random

f = open('data.txt', 'w')
for i in range(100000000):
    f.write(str(i) + "\t" + str(random.randint(0, 1000)) + "\n")
f.close()
When I launch it using CPython, it eats all available OS memory and writes nothing to the file.
When I launch it on Jython, I get an OutOfMemoryException.
As far as I understand, everything is kept in a memory buffer and never flushed before the close() call.
My question is: how do I limit the file buffer size and trigger an automatic flush?
I don't want to call flush() manually; I think that would be wrong from a performance point of view. I want flush() to be called automatically whenever the file buffer fills up.
Thanks!
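For reference, the buffer size can in fact be controlled. A minimal sketch, assuming the standard buffering argument of the built-in open() (1 = line-buffered, a larger number = buffer size in bytes); with it set, the file object flushes on its own whenever the buffer fills, with no manual flush() calls:

```python
# The third argument to open() sets the buffer size, so the file
# object flushes automatically whenever the buffer fills up.
f = open('data.txt', 'w', 4096)  # 4 KiB buffer
for i in range(1000):
    f.write(str(i) + "\n")
f.close()
```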
Buffering is not the problem. The problem is calling the range() function with a giant argument: in Python 2, range() attempts to allocate a list with 100,000,000 elements. You will get the same error if you just say range(100000000) on its own.
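A minimal sketch of the fix, assuming Python 2 (where range() materializes the full list up front): switch to the lazy xrange(), which yields one number at a time with constant memory use. On Python 3, range() is already lazy, so no change is needed; the shim below covers both. The count is reduced here so the example finishes quickly.

```python
import random
import sys

# On Python 2, range() builds the whole list in memory; xrange()
# is the lazy equivalent. On Python 3, range() is already lazy,
# so no shim is needed.
if sys.version_info[0] == 2:
    range = xrange  # noqa: F821

N = 1000  # reduced from 100000000 so the demo runs quickly
f = open('data.txt', 'w')
for i in range(N):
    f.write(str(i) + "\t" + str(random.randint(0, 1000)) + "\n")
f.close()
```

With the lazy iterator, memory use stays flat no matter how large N is; the original 100,000,000-line run is then limited only by disk space and time, not RAM.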