I'm trying to get a simple OpenCV sample working in C++ on Windows and my C++ is more than rusty.
The sample is fairly simple:
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <iostream>
using namespace cv;
using namespace std;
int main( int argc, char** argv )
{
    if( argc != 2)
    {
        cout << " Usage: display_image ImageToLoadAndDisplay" << endl;
        return -1;
    }
    Mat image;
    image = imread(argv[1], IMREAD_COLOR); // Read the file
    if( !image.data )                      // Check for invalid input
    {
        cout << "Could not open or find the image" << std::endl;
        return -1;
    }
    namedWindow( "Display window", WINDOW_AUTOSIZE ); // Create a window for display.
    imshow( "Display window", image );                // Show our image inside it.
    waitKey(0);                                       // Wait for a keystroke in the window
    return 0;
}
When I create a new simple C++ console application (with ATL) in Visual Studio 2012, the project template gives me a different signature for main:
int _tmain( int argc, _TCHAR* argv[] )
So before I send the filename to OpenCV's imread function I need to convert the _TCHAR* argv[1] to a char*. Using the simple filename 'opencv-logo.jpg', I can see in the memory window that each _TCHAR takes two bytes:
o.p.e.n.c.v.-.l.o.g.o...j.p.g...
6f 00 70 00 65 00 6e 00 63 00 76 00 2d 00 6c 00 6f 00 67 00 6f 00 2e 00 6a 00 70 00 67 00 00 00
Following the conversion recommendation in another answer, I am trying to use the ATL 7.0 String Conversion Classes and Macros by inserting the following code:
char* filename = CT2A(argv[1]);
But the resulting memory is a mess, certainly not 'opencv-logo.jpg' as an ASCII string:
fe fe fe fe fe fe fe fe fe fe ...
þþþþþþþþþþ ...
Which conversion technique, function, or macro should I be using?
(N.B. This may be a related question but I cannot see how to apply the answer here.)
The quickest solution is to just change the signature to the standard one. Replace:
int _tmain( int argc, _TCHAR* argv[] )
with:
int main( int argc, char *argv[] )
This does mean that on Windows the command-line arguments get converted to the system's locale encoding, and since Windows does not support UTF-8 there, not everything converts correctly. However, unless you actually need internationalization, it may not be worth your time to do anything more.
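For reference, with the standard signature the sample needs no conversion at all, since argv[1] is already a plain char*. A minimal sketch of just the relevant part:

    #include <opencv2/core/core.hpp>
    #include <opencv2/highgui/highgui.hpp>

    int main( int argc, char* argv[] )
    {
        if( argc != 2 )
            return -1;

        // argv[1] is already a narrow char* string, so it can go to imread untouched.
        cv::Mat image = cv::imread( argv[1], cv::IMREAD_COLOR );
        return image.empty() ? -1 : 0;
    }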
_TCHAR, i.e. TCHAR, is a type that depends on your project's settings. It can be either wchar_t (when you use Unicode) or char (when you use Multi-byte). You will find this in Project Properties - General, under the Character Set setting.
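Roughly speaking (this is a simplified sketch, not the literal contents of <tchar.h>), that setting boils down to a typedef along these lines:

    // Simplified sketch of what the "Character Set" project setting controls:
    #ifdef _UNICODE
    typedef wchar_t _TCHAR;   // "Use Unicode Character Set"
    #else
    typedef char    _TCHAR;   // "Use Multi-Byte Character Set"
    #endif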
Probably the simplest thing you could do is use the Multi-byte option, treat the _TCHAR* type as a plain char*, and use it to construct a std::string object ASAP:
std::string filename(argv[1]);
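For example, with the project set to Multi-byte, a minimal sketch of the asker's _tmain could look like this (note that _TCHAR is now just char, so nothing else changes):

    #include <opencv2/core/core.hpp>
    #include <opencv2/highgui/highgui.hpp>
    #include <string>
    #include <tchar.h>

    // Assumes the Character Set is "Use Multi-Byte Character Set",
    // so _TCHAR is plain char and argv[1] needs no conversion.
    int _tmain( int argc, _TCHAR* argv[] )
    {
        if( argc != 2 )
            return -1;

        std::string filename( argv[1] );   // construct the std::string ASAP
        cv::Mat image = cv::imread( filename, cv::IMREAD_COLOR );
        return image.empty() ? -1 : 0;
    }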
But if you are going to work with special characters a lot, I find it more reasonable to use Unicode and hold strings as std::wstring objects wherever possible. In that case, just use std::wstring's constructor instead:
std::wstring filename(argv[1]);
And if you end up working with wide strings, you will sometimes need to convert between wide strings and multi-byte strings; these helpers might help you:
#include <Windows.h>   // MultiByteToWideChar / WideCharToMultiByte
#include <string>

// multi-byte (here: UTF-8) to wide char:
std::wstring s2ws(const std::string& str)
{
    int size_needed = MultiByteToWideChar(CP_UTF8, 0, str.c_str(), (int)str.size(), NULL, 0);
    std::wstring wstrTo(size_needed, 0);
    MultiByteToWideChar(CP_UTF8, 0, str.c_str(), (int)str.size(), &wstrTo[0], size_needed);
    return wstrTo;
}

// wide char to multi-byte (here: the system ANSI code page, which is what
// narrow-string file APIs, like the one behind imread, expect):
std::string ws2s(const std::wstring& wstr)
{
    int size_needed = WideCharToMultiByte(CP_ACP, 0, wstr.c_str(), (int)wstr.size(), NULL, 0, NULL, NULL);
    std::string strTo(size_needed, 0);
    WideCharToMultiByte(CP_ACP, 0, wstr.c_str(), (int)wstr.size(), &strTo[0], size_needed, NULL, NULL);
    return strTo;
}
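For the original question, this means (assuming the project is left on Unicode, so _TCHAR is wchar_t) the filename can be narrowed with ws2s right before the imread call. Again only a sketch, using the same includes as the earlier example plus the helpers above:

    // Assumes the Character Set is "Use Unicode Character Set", so argv[1] is a wchar_t*.
    int _tmain( int argc, _TCHAR* argv[] )
    {
        if( argc != 2 )
            return -1;

        std::wstring wfilename( argv[1] );        // keep the wide string around
        std::string filename = ws2s( wfilename ); // narrow only where a char-based API needs it
        cv::Mat image = cv::imread( filename, cv::IMREAD_COLOR );
        return image.empty() ? -1 : 0;
    }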