
Is calling destructor manually always a sign of bad design?

I was thinking: they say that if you're calling a destructor manually, you're doing something wrong. But is that always the case? Are there any counter-examples? Are there situations where it is necessary to call it manually, or where it is hard/impossible/impractical to avoid?

asked Jan 06 '13 by Violet Giraffe

People also ask

Can you call destructor manually?

Yes, it is possible for the programmer to call special member functions, including the destructor, explicitly.

Are destructors automatically called?

A destructor is a member function that is invoked automatically when the object goes out of scope or is explicitly destroyed by a call to delete.

What happens when a destructor is called?

A destructor is called for a class object when that object passes out of scope or is explicitly deleted. A destructor is a member function with the same name as its class, prefixed by a ~ (tilde). For example:

    class X {
    public:
        X();   // Constructor for class X
        ~X();  // Destructor for class X
    };

What happens when destructor is not called?

It is automatically called when an object is destroyed, either because its scope of existence has ended (for example, it was defined as a local object within a function and the function ends) or because it is a dynamically allocated object that is released using the operator delete.
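
To illustrate the two cases above, here is a minimal sketch (the Tracer class and its messages are made up for illustration); one destructor call is triggered by the closing brace, the other by delete:

    #include <iostream>

    struct Tracer {
        Tracer()  { std::cout << "constructed\n"; }
        ~Tracer() { std::cout << "destroyed\n"; }
    };

    int main() {
        {
            Tracer local;        // destructor runs automatically at the closing brace
        }
        Tracer* p = new Tracer;  // no destructor yet; the object lives until...
        delete p;                // ...delete runs ~Tracer() and then frees the memory
    }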


1 Answer

All answers describe specific cases, but there is a general answer:

You call the destructor explicitly every time you need to destroy just the object (in the C++ sense) without releasing the memory the object resides in.

This typically happens in all the situations where memory allocation/deallocation is managed independently of object construction/destruction. In those cases, construction happens via placement new on an existing chunk of memory, and destruction happens via an explicit destructor call.

Here is a bare-bones example:

    #include <new>  // placement new

    {
        // alignas ensures the raw buffer is suitably aligned for MyClass
        // (the original snippet omitted this).
        alignas(MyClass) char buffer[sizeof(MyClass)];
        {
            MyClass* p = new (buffer) MyClass;  // construct in place, no allocation
            p->dosomething();
            p->~MyClass();                      // destroy the object, keep the storage
        }
        {
            MyClass* p = new (buffer) MyClass;  // reuse the same storage
            p->dosomething();
            p->~MyClass();
        }
    }

Another notable example is the default std::allocator as used by std::vector: elements are constructed in the vector during push_back, but the memory is allocated in chunks, so it pre-exists the element construction. Hence vector::erase must destroy the elements, but it does not necessarily deallocate the memory (especially if more push_back calls are likely to happen soon...).
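
The same separation can be shown with std::allocator directly; a minimal sketch (it only illustrates the allocate/construct/destroy/deallocate split, it is not how std::vector is actually implemented):

    #include <memory>
    #include <string>

    int main() {
        std::allocator<std::string> alloc;
        using Traits = std::allocator_traits<std::allocator<std::string>>;

        // 1. Get raw, uninitialized memory for 4 strings (no constructors run).
        std::string* buf = alloc.allocate(4);

        // 2. Construct an object in that memory (roughly what push_back does).
        Traits::construct(alloc, buf, "hello");

        // 3. Destroy the object without returning the memory (roughly what erase does).
        Traits::destroy(alloc, buf);

        // 4. Only now is the memory itself released.
        alloc.deallocate(buf, 4);
    }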

It is "bad design" in a strict OOP sense (you should manage objects, not memory: the fact that objects require memory is incidental), and it is "good design" in low-level programming, or in cases where memory is not taken from the "free store" that the default operator new draws from.

It is bad design if it happens randomly all around the code; it is good design if it happens locally, inside classes specifically designed for that purpose.
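
As a sketch of what "locally, inside classes designed for that purpose" can look like, here is a hypothetical fixed-capacity container (the name FixedStack and its interface are made up for this illustration) that keeps every placement new and explicit destructor call behind its own interface:

    #include <new>
    #include <cstddef>

    template <typename T, std::size_t N>
    class FixedStack {
        alignas(T) unsigned char storage_[N * sizeof(T)];  // raw, pre-allocated storage
        std::size_t size_ = 0;

        T* slot(std::size_t i) { return reinterpret_cast<T*>(storage_ + i * sizeof(T)); }

    public:
        void push(const T& value) {
            // Construct the element in place; no heap allocation here.
            new (slot(size_)) T(value);
            ++size_;
        }

        void pop() {
            --size_;
            slot(size_)->~T();  // explicit destructor call: object dies, storage stays
        }

        T& top() { return *slot(size_ - 1); }
        std::size_t size() const { return size_; }

        ~FixedStack() {
            while (size_ > 0) pop();  // destroy whatever is still alive
        }
    };

Users of such a class never call a destructor by hand; the explicit calls stay encapsulated, which is the "good design" case described above. (Bounds checks and std::launder are omitted to keep the sketch short.)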

answered Sep 22 '22 by Emilio Garavaglia