Create char array of integer using digits as size

I am trying to create a char array in C and fill it with the digits of an int, where the int can have any number of digits.

I'm using a function I wrote called getDigits(int num), which returns the number of digits the int has.
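
The original getDigits isn't shown; a minimal sketch of a function with that contract, counting digits by repeated division, could look like this:

int getDigits(int num) {
    int count = (num <= 0) ? 1 : 0; /* one slot for a '-' sign, or for the lone digit of 0 */
    while (num != 0) {
        num /= 10; /* C99 division truncates toward zero, so negatives converge too */
        count++;
    }
    return count; /* note: does not include the terminating '\0' */
}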

char buffer[getDigits(number)] = "";
snprintf(buffer, sizeof(buffer),"%d",number);

but when I compile with gcc, it reports:

error: variable-sized object may not be initialized

I've tried everything. When I declare it as char fileSizeStr[5] = "";, it works. I can see the problem arises when I try to declare the buffer size dynamically, but I would really like to know if there is a way of achieving this.

asked Jun 07 '13 by anairinac



1 Answer

The problem is exactly what your compiler is telling you: you're not allowed to initialise variable-length arrays (VLAs). Zack gave the obvious solution in the comments: remove the initialisation. You'll find working examples in this answer, some of which permit an initialisation and others which don't; the comments explain why in each case. The examples are ordered from most sensible (IMHO) to least sensible (the ones involving malloc) for allocating storage for the decimal digit sequence representing a number.
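
Applied to the snippet in the question, the minimal fix is just that, plus one extra byte for the NUL terminator that snprintf writes:

char buffer[getDigits(number) + 1]; /* VLA: no initialiser allowed; +1 leaves room for '\0' */
snprintf(buffer, sizeof(buffer), "%d", number);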


I suggest using the same trick to determine how many bytes are necessary to store an int value as decimal digits as you'd use for octal: divide the total number of bits in an int by 3 (each octal digit encodes 3 bits, and a number never has more decimal digits than octal digits), then add room for the sign and the NUL terminator. digit_count can be written as a preprocessor macro like so:

#include <limits.h>
#include <stddef.h>
#include <stdio.h>

#define digit_count(num) (1                                /* sign            */ \
                        + sizeof (num) * CHAR_BIT / 3      /* digits          */ \
                        + (sizeof (num) * CHAR_BIT % 3 > 0)/* remaining digit */ \
                        + 1)                               /* NUL terminator  */

int main(void) {
    short short_number = -32767;
    int int_number = 32767;
    char short_buffer[digit_count(short_number)] = { 0 }; /* initialisation permitted: digit_count expands to a constant expression, so this is not a VLA */
    char int_buffer[digit_count(int_number)];             /* an initialiser would be equally valid here */
    sprintf(short_buffer, "%d", short_number);
    sprintf(int_buffer, "%d", int_number);
}

As you can see, one powerful benefit here is that digit_count can be used for any type of integer without modification: char, short, int, long, long long, and the corresponding unsigned types.
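
For example, the same macro sizes a buffer for a long long without any change (this fragment assumes the digit_count macro and headers from the listing above):

long long big_number = -9223372036854775807LL;
char big_buffer[digit_count(big_number)]; /* room for 22 digits, a sign and the NUL on a 64-bit long long */
sprintf(big_buffer, "%lld", big_number);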

One minor downside by comparison is that you waste a few bytes of storage, particularly for small values like 1. In many cases the simplicity of this solution more than makes up for that; the code required to count the decimal digits at runtime would occupy more space in memory than the few bytes wasted here.


If you're prepared to throw away the simplicity and generic qualities of the above code, and you really want to count the number of decimal digits, Zack's advice applies: remove the initialisation. Here's an example:

#include <stddef.h>
#include <stdio.h>

size_t digit_count(int num) {
    return snprintf(NULL, 0, "%d", num) + 1;
}

int main(void) {
    int number = 32767;
    char buffer[digit_count(number)]; /* Erroneous initialisation removed as per Zack's advice */
    sprintf(buffer, "%d", number);
}

In response to the malloc recommendations: the least horrible way to solve this problem is to avoid unnecessary code (e.g. calls to malloc and, later, free). If you don't have to return the object from a function, then don't use malloc! Otherwise, consider storing into a buffer provided by the caller (via arguments), so that the caller can choose which kind of storage to use. It's very rare that this isn't an appropriate alternative to malloc.
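
As a sketch of that caller-provided-buffer pattern (format_int is a name invented for this example, not from the question or any standard library):

#include <stdio.h>

/* Writes the decimal form of num into buf, which the caller sized.
   Returns 0 on success, -1 if buf is too small. Hypothetical helper. */
int format_int(char *buf, size_t bufsize, int num) {
    int needed = snprintf(buf, bufsize, "%d", num);
    return (needed < 0 || (size_t)needed >= bufsize) ? -1 : 0;
}

The caller is then free to use automatic storage: char buf[digit_count(number)]; format_int(buf, sizeof buf, number);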

If you do decide to use malloc and free for this, however, do it the least horrible way. Avoid typecasts on the return value of malloc and multiplications by sizeof (char) (which is always 1). The following code is an example. Use either of the above methods to calculate the length:

char *buffer = malloc(digit_count(number)); /* Initialisation of malloc bytes not possible */
sprintf(buffer, "%d", number);

... and don't forget to free(buffer); when you're done with it.
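
Putting the pieces together, a complete sketch using the snprintf-based digit_count from above, with the error check that real code should have:

#include <stdio.h>
#include <stdlib.h>

size_t digit_count(int num) {
    return snprintf(NULL, 0, "%d", num) + 1; /* +1 for the NUL terminator */
}

int main(void) {
    int number = 32767;
    char *buffer = malloc(digit_count(number));
    if (buffer == NULL)            /* malloc can fail; check before writing */
        return EXIT_FAILURE;
    sprintf(buffer, "%d", number);
    puts(buffer);
    free(buffer);                  /* release the storage when done */
    return 0;
}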

answered Oct 26 '22 by autistic