
How to define NULL using #define

Tags:

c++

c

null

I want to redefine NULL in my program like this:

#define MYNULL ((void*)0)

But this definition does not work in the following statement:

char *ch = MYNULL;

Error: cannot convert from 'void *' to 'char *'

What would be the best way to define NULL?
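
For reference, the statement fails only under C++ rules: C allows an implicit conversion from void * to any object pointer type, while C++ requires an explicit cast. A minimal sketch of that difference, assuming a C++ compiler (the cast is only there to illustrate the rule, not a recommendation):

// conversion_demo.cpp -- C++ rejects the implicit void* -> char* conversion
#define MYNULL ((void*)0)

int main() {
    // char *ch = MYNULL;                     // error in C++ (accepted in C)
    char *ch = static_cast<char *>(MYNULL);   // compiles: the cast is explicit
    return ch == nullptr ? 0 : 1;             // ch is still a null pointer
}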

asked Feb 24 '10 by cppdev



1 Answer

Don't do this. Nothing says that NULL has to be the value zero; it's implementation-specific.

It could be a value pointing past the end of memory, some special location in memory, or even a representation whose only meaning is "no value exists."

Doing this is dangerous, may break portability, and will almost certainly confuse code-aware editors. It isn't buying you anything; trust your library's definition.

EDIT: Evan is correct! The code itself will say zero; under the hood the compiler can do whatever it wants with the implementation-specific representation. Thanks Evan!
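
A sketch of the idiom the answer is recommending: include the standard header and use the NULL it provides (or nullptr in C++11 and later) rather than rolling your own. Written as C++ here; the file and variable names are illustrative only:

// use_library_null.cpp -- rely on the implementation's definition of NULL
#include <cstddef>      // provides NULL (use <stddef.h> in C)

int main() {
    char *p = NULL;      // portable: NULL expands to a valid null pointer constant
    char *q = nullptr;   // preferred in C++11 and later

    // A literal 0 in pointer context also works: the compiler translates it
    // into whatever bit pattern the platform actually uses for null pointers.
    if (p == 0 && q == 0)
        return 0;
    return 1;
}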

answered Nov 11 '22 by Walt Stoneburner