I’ve occasionally encountered software – including compilers – that refuses to accept or correctly handle text files that aren’t terminated with a newline. I’ve even encountered explicit errors of the form,
no newline at the end of the file
…which would seem to indicate that they’re explicitly checking for this case and then rejecting it just to be stubborn.
Am I missing something here? Why would – or should – anything care whether or not a file ends with a seemingly superfluous bit of whitespace?
Historically, at least in the Unix world, “newline” – or rather U+000A Line Feed (LF) – was a line *terminator*: it marks the end of a line, not the boundary between two lines. Under that convention, a file whose last line lacks a trailing LF doesn’t end in a complete line at all, which is why strict tools may reject it. This stands in stark contrast to the practice on Windows, for example, where CR+LF is a line *separator*.
A naïve way to read every line in a file is to append characters to a buffer until an LF is encountered. Implemented carelessly, this silently ignores the last line of a file that isn’t terminated by an LF, as the sketch below shows.
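Here is a minimal C sketch of such a careless reader (the buffer size and function name are illustrative, not taken from any particular tool):

```c
#include <stdio.h>

/* Naive line reader: a line is only "emitted" once an LF is seen,
 * so a final line without a trailing LF is silently dropped. */
static void read_lines_naively(FILE *fp)
{
    char buf[1024];
    size_t len = 0;
    int c;

    while ((c = fgetc(fp)) != EOF) {
        if (c == '\n') {
            buf[len] = '\0';
            puts(buf);              /* emit the completed line */
            len = 0;                /* start collecting the next one */
        } else if (len + 1 < sizeof buf) {
            buf[len++] = (char)c;
        }
    }
    /* Bug: if the file didn't end with an LF, len > 0 here and the
     * last line is still sitting in buf -- it is never emitted.
     * A careful reader would flush this remainder as a final line. */
}

int main(void)
{
    read_lines_naively(stdin);
    return 0;
}
```

Feed it two lines where the second has no trailing LF (e.g. `printf 'one\ntwo'`) and it prints only “one”; the unterminated “two” is still sitting in the buffer when EOF arrives.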
Another thing to consider are macro systems that allow including files. A line such as
might be replaced by the contents of the named file, where, if its last line didn’t end with an LF, that line would get merged with the line following the include directive. And yes, I’ve seen this behavior with a particular macro assembler for an embedded platform.
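To illustrate the failure mode (all file names, directives, and symbols here are made up): suppose macros.inc consists of a single definition whose line lacks a trailing LF,

```
BUFSIZE equ 512
```

and is included like this:

```
include "macros.inc"
mov cx, BUFSIZE
```

After naive textual substitution the assembler sees the two lines fused into one,

```
BUFSIZE equ 512mov cx, BUFSIZE
```

which defines nothing and assembles to garbage, if it assembles at all.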
Nowadays I firmly believe that (a) it’s a relic of ancient times and (b) I haven’t seen modern software that can’t handle it – and yet we still carry around numerous editors on Unix-like systems that helpfully put one more byte than needed at the end of a file.