Class wrapper around file -- proper way to close file handle when no longer referenced

I've got a class that wraps some file handling functionality I need. Another class creates an instance of the filehandler and uses it for an indeterminate amount of time. Eventually, the caller is destroyed, which destroys the only reference to the filehandler.

What is the best way to have the filehandler close the file?

I currently use __del__(self) but after seeing several different questions and articles, I'm under the impression this is considered a bad thing.

import logging

class fileHandler:
    def __init__(self, dbf):
        self.logger = logging.getLogger('fileHandler')
        self.thefile = open(dbf, 'rb')
    def __del__(self):
        # runs whenever the interpreter gets around to collecting the object
        self.thefile.close()

That's the relevant bit of the handler. The whole point of the class is to abstract away the details of working with the underlying file object, and to avoid reading the entire file into memory unnecessarily. However, part of handling the underlying file is closing it when the object falls out of scope.

The caller is not supposed to know or care about the details involved in the filehandler. It is the filehandler's job to release any resources it holds when it falls out of scope. That's one of the reasons it was abstracted in the first place. So, I seem to be faced with either moving the filehandler code into the calling object or dealing with a leaky abstraction.

Thoughts?

Asked by Spencer Rathbun on Dec 20 '11

2 Answers

__del__ is not, by itself, a bad thing. You just have to be extra careful to not create reference cycles in objects that have __del__ defined. If you do find yourself needing to create cycles (parent refers to child which refers back to parent) then you will want to use the weakref module.

So, __del__ is okay, just be wary of cyclic references.
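
To make that concrete, here is a minimal sketch (my own illustration, not from this answer) of a parent/child pair where the back-reference is held through weakref, so no strong reference cycle is created and objects that define __del__ can still be collected promptly. The Parent and Child names are hypothetical:

import weakref

class Child:
    def __init__(self, parent):
        # keep only a weak reference back to the parent so the
        # parent <-> child pair never forms a strong reference cycle
        self._parent = weakref.ref(parent)

    @property
    def parent(self):
        # returns the parent, or None once it has been collected
        return self._parent()

class Parent:
    def __init__(self):
        self.child = Child(self)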

Garbage collection: the important point here is that when an object goes out of scope it can be garbage collected, and in fact it will be garbage collected... but when? There is no guarantee about when, and different Python implementations have different characteristics in this area. So for managing resources you are better off being explicit: either add a .close() method to your filehandler or, if your usage is compatible, add __enter__ and __exit__ methods.
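
For example, the explicit route could look something like this (a minimal sketch; the close() method and the try/finally usage are my additions, not part of the question's class, and 'some_dbf' is just a placeholder path):

import logging

class fileHandler:
    def __init__(self, dbf):
        self.logger = logging.getLogger('fileHandler')
        self.thefile = open(dbf, 'rb')

    def close(self):
        # the caller is responsible for calling this when it is done
        self.thefile.close()

fh = fileHandler('some_dbf')
try:
    pass  # ... work with fh.thefile ...
finally:
    fh.close()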

The __enter__ and __exit__ methods are part of Python's context manager protocol. One really nice thing about them is that __exit__ is called even when exceptions occur, so you can count on your resources being closed gracefully.

Your code, enhanced for __enter__/__exit__:

import logging

class fileHandler:
    def __init__(self, dbf):
        self.logger = logging.getLogger('fileHandler')
        # remember the path; the file itself is opened lazily in __enter__
        self.thefilename = dbf
    def __enter__(self):
        self.thefile = open(self.thefilename, 'rb')
        return self
    def __exit__(self, *args):
        # called on normal exit and when an exception escapes the with block
        self.thefile.close()

Note that the file is opened in __enter__ instead of __init__ -- this allows you to create the fileHandler object once, and then use it whenever you need to in a with statement without recreating it:

fh = fileHandler('some_dbf')
with fh:
    #file is now opened
    #do some stuff
#file is now closed
#blah blah
#need the file again, so
with fh:
    # file is open again, do some stuff with it
#etc, etc 
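
To see the exception handling point from above in action, here is a quick sketch (my example, not from the answer; it assumes 'some_dbf' actually exists on disk):

fh = fileHandler('some_dbf')
try:
    with fh:
        raise RuntimeError('something went wrong')
except RuntimeError:
    pass
print(fh.thefile.closed)   # True -- __exit__ already closed the file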

Answered by Ethan Furman

As you've written it, the class doesn't make the file close any more reliably. If you simply drop the filehandler instance on the floor then the file won't close until the object is destroyed. That might happen immediately, or it might not happen until the object is garbage collected, but just dropping a plain file object on the floor would close it just as quickly. If the only reference to thefile is from inside your class object, then when the filehandler is garbage collected thefile will also be garbage collected, and therefore closed, at the same time.

The correct way to use files is to use the with statement:

with open(dbf, 'rb') as thefile:
    do_something_with(thefile)

that will guarantee that thefile is always closed whenever the with clause exits. If you want to wrap your file inside another object you can do that too by defining __enter__ and __exit__ methods:

import logging

class FileHandler:
    def __init__(self, dbf):
        self.logger = logging.getLogger('fileHandler')
        self.thefile = open(dbf, 'rb')
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc_value, traceback):
        # runs whether the with block exits normally or via an exception
        self.thefile.close()

and then you can do:

with FileHandler(dbf) as fh:
    do_something_with(fh)

and be sure the file gets closed promptly.
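
As an aside (my suggestion, not part of this answer): if a wrapper exposes only a close() method rather than __enter__/__exit__, contextlib.closing from the standard library gives the same with-statement guarantee. This assumes FileHandler defines a close() method, which the version above does not:

from contextlib import closing

with closing(FileHandler(dbf)) as fh:
    do_something_with(fh)
# fh.close() has been called at this point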

Answered by Duncan