I am using Python's csv module to extract data from a CSV file that is constantly being appended to by an external tool. The problem is that when I reach the end of the file I get a StopIteration error; instead, I would like the script to keep looping, waiting for more lines to be added by the external tool.
What I have come up with so far to do this is:
import csv

f = open('file.csv')
csvReader = csv.reader(f, delimiter=',')
while 1:
    try:
        doStuff(csvReader.next())
    except StopIteration:
        depth = f.tell()
        f.close()
        f = open('file.csv')
        f.seek(depth)
        csvReader = csv.reader(f, delimiter=',')
This has the intended functionality, but it also seems terrible. Simply looping after catching the StopIteration is not possible, since once StopIteration is thrown, every subsequent call to next() will throw it again. Does anyone have any suggestions on how to implement this in such a way that I don't have to do this silly tell and seeking? Or is there a different Python module that easily supports this functionality?
Your problem is not with the CSV reader, but with the file object itself. You may still have to do the crazy gyrations you're doing in your snippet above, but it would be better to create a file-object wrapper or subclass that does it for you, and use that with your CSV reader. That keeps the complexity isolated from your CSV-processing code.
For instance (warning: untested code):
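A minimal sketch of such a wrapper might look like this (the class name, the polling interval, and the use of Python 3's `__next__` are my assumptions; it never raises StopIteration, on the premise that the external tool keeps appending):

```python
import csv
import time

class ReopeningFile:
    """Iterable of lines from a file that, on hitting end-of-file,
    remembers its position, reopens the file, seeks back, and waits
    for the external tool to append more data.
    Hypothetical sketch -- names and polling interval are assumptions."""

    def __init__(self, filename, poll_interval=1.0):
        self.filename = filename
        self.poll_interval = poll_interval
        self.f = open(filename)

    def __iter__(self):
        return self

    def __next__(self):
        while True:
            line = self.f.readline()
            if line:
                return line
            # EOF: remember where we were, reopen, seek, and wait
            depth = self.f.tell()
            self.f.close()
            time.sleep(self.poll_interval)
            self.f = open(self.filename)
            self.f.seek(depth)
```

Because `csv.reader` accepts any iterable that yields strings, an instance of this class can be passed to it directly in place of a plain file object.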
Then your main code becomes simpler, as it is freed from having to manage the file reopening (note that you also don't have to restart your csv reader whenever the file is reopened):
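The loop reduces to a plain `for` over the reader. Here is a runnable sketch; `doStuff` is a placeholder, and a plain file stands in for the reopening wrapper described above so the example is self-contained (with the wrapper, the same loop would simply block at EOF instead of ending):

```python
import csv

# Sample data so the sketch runs standalone (stands in for the
# file the external tool is writing).
with open('file.csv', 'w') as sample:
    sample.write('a,1\nb,2\n')

rows = []

def doStuff(row):
    # Placeholder for the real per-row processing.
    rows.append(row)

# With the reopening wrapper, replace open('file.csv') with the
# wrapper instance; the loop body stays exactly the same.
for row in csv.reader(open('file.csv'), delimiter=','):
    doStuff(row)
```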