 

c# sizeof decimal?

Tags:

c#

I'm unclear on the size of the decimal type. Does its size in bytes vary with precision, as SQL Server's DECIMAL does? Is precision variable for the C# type decimal?

I don't want to turn on unsafe code just to call sizeof on a decimal. How would you approach this?
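For what it's worth, sizeof on the built-in simple types, decimal included, is a compile-time constant and does not require an unsafe context in current C#; only sizeof on other struct types does. A minimal sketch (Marshal.SizeOf shown as a runtime alternative):

```csharp
using System;
using System.Runtime.InteropServices;

class SizeOfDecimal
{
    static void Main()
    {
        // sizeof on the built-in simple types (decimal included) is a
        // compile-time constant; no unsafe context is needed.
        Console.WriteLine(sizeof(decimal));                  // 16

        // Marshal.SizeOf gives the same answer at runtime.
        Console.WriteLine(Marshal.SizeOf(typeof(decimal)));  // 16
    }
}
```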

Asked by P a u l, Apr 13 '09


2 Answers

The decimal keyword indicates a 128-bit data type.

Source: MSDN

Answered by Tormod Fjeldskår, Oct 03 '22


As others have said, decimal is always 16 bytes (128 bits). Its precision is always 28-29 significant digits. It's a floating-point type (with a base-10 exponent), unlike SQL Server's DECIMAL, which has a fixed, user-specified scale. See my article on it for more details.
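One way to see the fixed 128-bit layout directly is decimal.GetBits, which exposes the four 32-bit words: a 96-bit integer mantissa plus a word holding the sign and the decimal scale. A small sketch:

```csharp
using System;

class DecimalBits
{
    static void Main()
    {
        // decimal.GetBits returns four 32-bit ints: bits[0..2] hold a
        // 96-bit integer mantissa; bits[3] holds the sign (bit 31) and
        // the scale (bits 16-23), i.e. the power of ten to divide by.
        int[] bits = decimal.GetBits(1.50m);
        Console.WriteLine(bits[0]);                 // 150 (mantissa)
        Console.WriteLine((bits[3] >> 16) & 0xFF);  // 2   (scale: 150 / 10^2)

        // The scale is preserved per value, so 1.5m and 1.50m are equal
        // in value but print differently.
        Console.WriteLine(1.5m == 1.50m);           // True
        Console.WriteLine(1.50m);                   // 1.50
    }
}
```

This also illustrates why the size never varies: every decimal, regardless of its value, occupies the same four words.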

Answered by Jon Skeet, Oct 03 '22