I wrote the code below, which reads a number in hex format and outputs it in decimal form:
#include <iostream>
#include <iomanip>
#include <stdint.h>
using namespace std;

int main()
{
    uint8_t c;
    cin >> hex >> c;
    cout << dec << c;
    //cout << sizeof(c);
    return 0;
}
But when I input c (hex for 12), the output is again c (and not 12). Can somebody explain?
This is because uint8_t is usually a typedef for unsigned char, and operator>> for a character type extracts a single character rather than parsing a number. So it's actually reading 'c' as the character with ASCII code 0x63, and the hex manipulator has no effect. Use int instead:
#include <iostream>
#include <iomanip>
using namespace std;

int main()
{
    int c;
    cin >> hex >> c;
    cout << dec << c << '\n';
    return 0;
}
Program output:
$ g++ test.cpp
$ ./a.out
c
12
This is an unfortunate side effect of the fact that uint8_t is actually unsigned char. So when you read into c, the stream stores the ASCII value of the character 'c' (99 decimal, 0x63 hex), not the numeric value 12.
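If you do want to keep the result in a uint8_t, a minimal sketch (assuming the input value actually fits in 8 bits) is to read into a wider unsigned type first and then narrow:

#include <iostream>
#include <cstdint>
using namespace std;

int main()
{
    unsigned int tmp;                       // wide enough for operator>> to parse hex digits
    cin >> hex >> tmp;                      // "c" is parsed as the number 0xC
    uint8_t c = static_cast<uint8_t>(tmp);  // narrow to 8 bits

    // Promote back to int for printing; otherwise << would print a character again
    cout << dec << static_cast<int>(c) << '\n';
    return 0;
}

Both steps are needed because iostreams treat uint8_t as a character type in both directions: extraction reads one character, and insertion prints one character.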