For NUMBER columns in Oracle, does specifying a length help performance?

Tags: sql, oracle

Is there any true difference in performance between NUMBER, NUMBER(3) and NUMBER(10)? Doesn't the RDBMS use 32 bits to store the value regardless, and simply limit the length of the data when looked at as a string?

A colleague argued that it was, for example, more efficient to use NUMBER(10) rather than NUMBER or NUMBER(11), thinking that n was the number of bytes rather than the length of the data in digits. Normally I'd agree to limit the column size if the number of rows was guaranteed not to exceed 10^n or so, but for this particular database the limit on the number of rows was literally "as many as possible", e.g. 2^32 or ~4 billion, even though we'd never reach that amount. Coming up with a "lowest maximum" in this situation seems pointless and a waste of time, so I thought using NUMBER without specifying a length would be simpler and incur no penalties. Am I correct?

asked Nov 15 '10 by Jake Petroules

1 Answer

Technically, you don't define a length, but a precision and scale.

The same numeric value (e.g. 100, 4.3) is stored in the same internal representation irrespective of the precision and scale defined for the column.
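You can check this yourself with Oracle's VSIZE and DUMP functions. A minimal sketch (the table and column names here are made up for illustration):

```sql
-- Hypothetical table: one unconstrained NUMBER and one NUMBER(10)
CREATE TABLE num_test (
  n_plain NUMBER,
  n_10    NUMBER(10)
);

INSERT INTO num_test VALUES (100, 100);

-- VSIZE returns the number of bytes the stored value occupies;
-- DUMP shows the internal encoding. Both columns report the same
-- size and encoding for the same value.
SELECT VSIZE(n_plain) AS bytes_plain,
       VSIZE(n_10)    AS bytes_10,
       DUMP(n_plain)  AS dump_plain,
       DUMP(n_10)     AS dump_10
FROM   num_test;
```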

The maximum value a column can hold is determined by BOTH precision and scale. That is, you can store the value 100 in a NUMBER(3,0) column but not in a NUMBER(3,1) column.
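As a quick illustration (hypothetical table name), the second insert below fails with ORA-01438, because NUMBER(3,1) leaves only two digits before the decimal point:

```sql
CREATE TABLE precision_demo (
  a NUMBER(3,0),
  b NUMBER(3,1)
);

INSERT INTO precision_demo (a) VALUES (100);  -- OK: 100 fits in NUMBER(3,0)
INSERT INTO precision_demo (b) VALUES (100);  -- ORA-01438: value larger than
                                              -- specified precision allowed
```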

Generally, if a column shouldn't store decimals, the scale should be zero. Bear in mind that if you try to store 10.12 in a NUMBER(3,0) column it will store the value 10. It doesn't raise an error (because if it did, you'd have a hard time storing the value of one third in anything). If you want an error instead of silent rounding, allow a higher scale and use a check constraint to reject fractional values, as sketched below.
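A sketch of that approach (the table and constraint names are made up): keep some scale on the column, then let a CHECK constraint reject anything with a fractional part.

```sql
CREATE TABLE whole_only (
  qty NUMBER(5,2),
  -- Reject any value that isn't a whole number
  CONSTRAINT qty_is_integer CHECK (qty = TRUNC(qty))
);

INSERT INTO whole_only VALUES (10);     -- OK
INSERT INTO whole_only VALUES (10.12);  -- fails with ORA-02290 (check constraint violated)
```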

I also believe you should try to use a 'sensible' precision. If you stored a thousand values each second for a thousand years, you'd still fit that within 14 digits. If I see a column of NUMBER(14,0) then I know that on my screen or printout I should allow for 14 numeric characters to be displayed. If I see just NUMBER or NUMBER(38,0), then I haven't really been given any guidance and may guess at 14 characters. And if they start putting 16-character credit card numbers in there, I'll be wrong. Which is why I prefer not to guess.
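You can sanity-check that arithmetic directly in Oracle (plain arithmetic on DUAL):

```sql
-- 1,000 values per second, for 1,000 years of roughly 31,557,600 seconds each
SELECT 1000 * 31557600 * 1000                   AS total_values,
       LENGTH(TO_CHAR(1000 * 31557600 * 1000))  AS digits_needed
FROM   dual;
-- total_values = 31,557,600,000,000 -> 14 digits
```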

Also, PL/SQL has a PLS_INTEGER datatype which goes up to 2147483647. If I see a column of NUMBER(9,0) I know I can fit it into a PLS_INTEGER. There are probably similar considerations for Java or .NET etc. when they want to work out what datatype, scale and precision to use when pulling/pushing data to the database.
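As a rough illustration (the orders table and order_id column are hypothetical), a NUMBER(9,0) column can be fetched straight into a PLS_INTEGER, since every 9-digit integer fits within PLS_INTEGER's range of -2,147,483,648 to 2,147,483,647:

```sql
DECLARE
  -- order_id is assumed to be declared as NUMBER(9,0)
  v_id PLS_INTEGER;
BEGIN
  SELECT order_id INTO v_id FROM orders WHERE ROWNUM = 1;
  DBMS_OUTPUT.PUT_LINE('Fetched id: ' || v_id);
END;
/
```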

answered Sep 28 '22 by Gary Myers