Assume we have a drawing program with different elements, such as circles, rectangles, triangles and so on: different kinds of objects that all need similar functions, such as draw(), to display themselves.
I wonder how a programmer would approach the problem that is nowadays typically solved by polymorphism, i.e. iterating over a collection of non-identical elements and invoking common functionality on each of the different objects.
One way that comes to mind is a struct holding a function pointer to the appropriate function (or an index into a function-pointer array) as well as a void pointer to the actual instance; the pointer would then be cast to the proper type inside the function. But that is just how I - someone clueless on the subject - would do it.
I do realize this might be a naive question, but since I wasn't around in the "olden" days, I really wonder how this problem was tackled. What kind of approach was used in procedural programming, and did it have a performance benefit? After all, polymorphism has some overhead even in fast languages like C++, due to the virtual method lookup.
A really simple example.
If this interests you, you can find more of this style in the Linux kernel.
#include <stdio.h>

/* The "vtable": a struct of function pointers shared by all shapes. */
struct shape {
    void (*say_hello)(void);
};

void circle_say_hello(void)
{
    printf("Hi I am circle!\n");
}

void square_say_hello(void)
{
    printf("Meh I am square.\n");
}

#define ARRAY_SIZE(a) (sizeof(a) / sizeof(a[0]))

int main(void)
{
    struct shape circle = { .say_hello = circle_say_hello, };
    struct shape square = { .say_hello = square_say_hello, };
    struct shape *shapes[] = { &circle, &square };
    size_t i;

    /* Dispatch through the function pointer, like a virtual call. */
    for (i = 0; i < ARRAY_SIZE(shapes); i++) {
        if (shapes[i] && shapes[i]->say_hello)
            shapes[i]->say_hello();
    }
    return 0;
}