 

Which algorithm does CreditCardAttribute use for credit card number format validation

.NET 4.5 includes a new validation attribute named CreditCardAttribute, which specifies that a data field value is a credit card number. When I decompile the assembly that contains this class, I see the following code for the credit card number validation:

public override bool IsValid(object value)
{
  if (value == null)
  {
    return true;
  }
  string text = value as string;
  if (text == null)
  {
    return false;
  }
  text = text.Replace("-", "");
  text = text.Replace(" ", "");
  int num = 0;
  bool flag = false;
  foreach (char current in text.Reverse<char>())
  {
    if (current < '0' || current > '9')
    {
      return false;
    }
    int i = (int)((current - '0') * (flag ? '\u0002' : '\u0001'));
    flag = !flag;
    while (i > 0)
    {
      num += i % 10;
      i /= 10;
    }
  }
  return num % 10 == 0;
}
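For context, this is roughly how the attribute is applied and checked on a model; it's just a minimal sketch, and the class and property names below are my own placeholders:

using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;

public class PaymentInfo
{
    // CreditCardAttribute only checks that the number is well formed,
    // not that the card exists or can be charged.
    [CreditCard]
    public string CardNumber { get; set; }
}

// Somewhere in the calling code:
var model = new PaymentInfo { CardNumber = "4111-1111-1111-1111" };
var results = new List<ValidationResult>();
bool isValid = Validator.TryValidateObject(
    model, new ValidationContext(model), results, validateAllProperties: true);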

Does anybody know which algorithm the decompiled code applies to validate the number format? Is it the Luhn algorithm? Also, is it an ISO standard? Finally, do you think this is the right and 100% correct implementation?

MSDN doesn't provide much information about this. In fact, the documentation is wrong, as quoted below:

Remarks

The value is validated using a regular expression. The class does not validate that the credit card number is valid for purchases, only that it is well formed.

asked Jan 15 '23 by tugberk


1 Answer

The last line:

return num % 10 == 0;

is a very strong hint that this is the Luhn algorithm (the mod 10 check): every second digit from the right is doubled, the digits of each result are summed, and a well-formed number produces a total divisible by 10.
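For comparison, here is a minimal standalone sketch of the Luhn check that mirrors the decompiled loop above; it's an illustration of the algorithm, not the framework's actual code:

static bool PassesLuhn(string number)
{
    int sum = 0;
    bool doubleIt = false;                     // every second digit from the right gets doubled
    for (int i = number.Length - 1; i >= 0; i--)
    {
        char c = number[i];
        if (c < '0' || c > '9')
        {
            return false;                      // any non-digit makes the number invalid
        }
        int digit = (c - '0') * (doubleIt ? 2 : 1);
        doubleIt = !doubleIt;
        sum += digit / 10 + digit % 10;        // a doubled digit like 14 contributes 1 + 4
    }
    return sum % 10 == 0;                      // well formed when the total is a multiple of 10
}

// PassesLuhn("4111111111111111") == true  -- the well-known Visa test number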

answered Jan 18 '23 by Jamiec