Is there any API in C++ for getting the size of a specified folder?
If not, how can I get the total size of a folder, including all subfolders and files?
In Windows Explorer, right-click the file, folder, or drive you're investigating and choose Properties from the context menu. This shows the total size: a folder displays the size as text, while a drive also shows a pie chart to make it easier to see.
To view the size of a directory on Linux, pass the -s option to the du command followed by the folder path. This prints a grand total for the folder to standard output; adding the -h option gives a human-readable format.
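For example, on a system with GNU du (the path is just a placeholder):

du -sh /path/to/folder    # human-readable grand total, e.g. "1.4G  /path/to/folder"
du -sb /path/to/folder    # total in bytes, as used by the popen() answer below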
Using the ls command: -h scales file and directory sizes into KB, MB, GB, or TB when the size is larger than 1024 bytes, and -s lists the files and directories with their sizes in blocks.
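A quick sketch of that usage (note that ls -s reports the size of each entry itself, not a recursive total for subdirectories):

ls -sh /path/to/folder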
How about letting the OS do it for you:
#include <cstdio>
#include <cstdlib>
#include <string>

long long int getFolderSize(const std::string &path)
{
    // Build the shell command: du reports the total size in bytes and
    // cut keeps only the first (numeric) field.
    // Note: a path containing spaces or shell metacharacters would need quoting.
    std::string cmd("du -sb ");
    cmd.append(path);
    cmd.append(" | cut -f1 2>&1");

    // Execute the command and read its output.
    FILE *stream = popen(cmd.c_str(), "r");
    if (stream) {
        const int max_size = 256;
        char readbuf[max_size];
        long long int size = -1;
        if (fgets(readbuf, max_size, stream) != NULL) {
            size = atoll(readbuf);
        }
        // Always close the pipe before returning.
        pclose(stream);
        return size;
    }
    // return error val
    return -1;
}
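A minimal usage sketch, assuming the function above and a POSIX system where popen and du are available (the path is just an example):

#include <iostream>

int main()
{
    long long int size = getFolderSize("/tmp");
    if (size >= 0)
        std::cout << "Folder size in bytes: " << size << std::endl;
    else
        std::cout << "Could not determine folder size" << std::endl;
    return 0;
}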
Actually I don't want to use any third-party library. I just want to implement it in pure C++.
If you use MSVC++ you have <filesystem> "as standard C++".
But whether you use Boost or MSVC, both are "pure C++".
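For what it's worth, with a compiler that supports C++17 the standard <filesystem> header alone is enough. A minimal sketch (the folder path and the choice to skip unreadable entries are my own assumptions, not part of the original answer):

#include <cstdint>
#include <filesystem>
#include <iostream>

namespace fs = std::filesystem;

// Sum the sizes of all regular files under the given folder, recursively.
std::uintmax_t getFolderSize(const fs::path &folder)
{
    std::uintmax_t total = 0;
    for (const auto &entry : fs::recursive_directory_iterator(
             folder, fs::directory_options::skip_permission_denied))
    {
        if (entry.is_regular_file())
            total += entry.file_size();
    }
    return total;
}

int main()
{
    std::cout << getFolderSize("C:\\Silvio") << std::endl;
    return 0;
}

Note that recursive_directory_iterator and file_size() can still throw filesystem_error on entries they cannot access, so production code might prefer the std::error_code overloads.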
If you don't want to use Boost, but only the C++ std:: library, this answer comes somewhat close. As you can see, there is a Filesystem Library Proposal (Revision 4), which states:
The Boost version of the library has been in widespread use for ten years. The Dinkumware version of the library, based on N1975 (equivalent to version 2 of the Boost library), ships with Microsoft Visual C++ 2012.
To illustrate the use, I adapted the answer of @Nayana Adassuriya with very minor modifications: he forgot to initialize one variable, I use unsigned long long, and, most importantly, I use path filePath(complete(dirIte->path(), folderPath)); to restore the complete path before calling the other functions. I have tested it and it works well on Windows 7.
#include <iostream>
#include <string>
#include <filesystem>

using namespace std;
using namespace std::tr2::sys;

void getFoldersize(string rootFolder, unsigned long long &f_size)
{
    path folderPath(rootFolder);
    if (exists(folderPath))
    {
        directory_iterator end_itr;
        for (directory_iterator dirIte(rootFolder); dirIte != end_itr; ++dirIte)
        {
            // Rebuild the complete path before calling the other functions.
            path filePath(complete(dirIte->path(), folderPath));
            try
            {
                if (!is_directory(dirIte->status()))
                {
                    f_size = f_size + file_size(filePath);
                }
                else
                {
                    getFoldersize(filePath, f_size);
                }
            }
            catch (exception &e) { cout << e.what() << endl; }
        }
    }
}

int main()
{
    unsigned long long f_size = 0;
    getFoldersize("C:\\Silvio", f_size);
    cout << f_size << endl;
    system("pause");
    return 0;
}
You may use Boost in this way. You can try to optimize it further.
#include <iostream>
#include <string>
#include <boost/filesystem.hpp>
#include <boost/filesystem/operations.hpp>
#include <boost/algorithm/string.hpp>

using namespace std;
namespace bsfs = boost::filesystem;

void getFoldersize(string rootFolder, long &file_size)
{
    boost::replace_all(rootFolder, "\\\\", "\\");
    bsfs::path folderPath(rootFolder);
    if (bsfs::exists(folderPath))
    {
        bsfs::directory_iterator end_itr;
        for (bsfs::directory_iterator dirIte(rootFolder); dirIte != end_itr; ++dirIte)
        {
            bsfs::path filePath(dirIte->path());
            try
            {
                if (!bsfs::is_directory(dirIte->status()))
                {
                    file_size = file_size + bsfs::file_size(filePath);
                }
                else
                {
                    getFoldersize(filePath.string(), file_size);
                }
            }
            catch (exception &e)
            {
                cout << e.what() << endl;
            }
        }
    }
}

int main()
{
    long file_size = 0;
    getFoldersize("C:\\logs", file_size);
    cout << file_size << endl;
    system("pause");
    return 0;
}