I really don't understand this. Maybe someone can explain it to me. I wanted to find out how many elements are in dataInput. I'm programming in C.
void Calibrate_data(float* dataInput)
{
int dataSize = sizeof(*dataInput)/sizeof(float);
printf("%i and %i",sizeof(*dataInput)/sizeof(float),dataSize);
}
The output is:
1 and 10
The problem is a wrong format specifier here:

printf("%i and %i",sizeof(*dataInput)/sizeof(float),dataSize);
        ^^

sizeof yields a value of type size_t, which is unsigned; the correct format specifier is %zu (or %Iu in Visual Studio). Using the wrong format specifier invokes undefined behavior, but that does not seem to explain the output of 10 for dataSize, which does not make sense: sizeof(*dataInput) is the size of a single float, so we would expect sizeof(*dataInput)/sizeof(float) to be 1. As Macattack said, an SSCCE should help resolve that output.