I know that max_size() from std::allocator is a theoretical limit, but I just tried something and the numbers I get are huge; so how should I interpret these numbers, and what is the philosophy behind this function?
My example:
#include <iostream>
#include <memory>
#include <cstdint>

typedef int8_t Tnum_1;

struct X
{
    template <typename T>
    static void printAllocInfo()
    {
        std::cout << std::allocator<T>().max_size() << "\t Alloc max_size\n";
        std::cout << ((std::allocator<T>().max_size()) / (1024))
                  << "\t Alloc max_size / 1024\n";
        std::cout << (((std::allocator<T>().max_size()) / (1024)) / (1024))
                  << "\t\t Alloc max_size / 1024 / 1024\n";
    }
};

int main()
{
    X::printAllocInfo<Tnum_1>();
    return (0);
}
It prints:
18446744073709551615 Alloc max_size
18014398509481983 Alloc max_size / 1024
17592186044415 Alloc max_size / 1024 / 1024
And considering that I just used int8_t as the template argument, does this mean that I can allocate 18446744073709551615 bytes on my machine? This number is so big that it starts to lose any practical meaning.
The number you quote is of course 2^64 - 1, a.k.a. the largest 64-bit unsigned integer that can be represented. All max_size() indicates is that the allocator doesn't support anything larger, not that it supports everything below that size.
It is up to your OS to actually provide this memory to your program. But since you can compile a program on a different machine than the one you run it on (including a machine in the distant future), there is no reason to limit max_size() to smaller numbers.
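For instance, a request far below max_size() can still fail at run time. A minimal sketch (whether a huge request fails or appears to succeed depends on your OS and its overcommit policy):

#include <iostream>
#include <memory>
#include <new>
#include <cstdint>
#include <cstddef>

int main()
{
    std::allocator<int8_t> alloc;
    // Well below max_size(), but far more than any real machine provides.
    const std::size_t n = std::size_t(1) << 62; // 4 EiB
    try
    {
        int8_t* p = alloc.allocate(n);
        alloc.deallocate(p, n);
        std::cout << "allocation succeeded (overcommit?)\n";
    }
    catch (const std::bad_alloc&)
    {
        std::cout << "allocation failed even though n < max_size()\n";
    }
    return 0;
}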
NOTE: if you read Table 31 — Allocator requirements in the C++ Standard, you find the following for an allocator object a of type X:

a.allocate(n): Memory is allocated for n objects of type T, but objects are not constructed.

a.max_size(): the largest value that can meaningfully be passed to X::allocate().

So max_size is the largest number of objects, not bytes, that can be allocated.
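You can check the object-count semantics directly. A sketch, assuming a typical implementation with a 64-bit size_type, where max_size() is roughly std::numeric_limits<std::size_t>::max() / sizeof(T) and therefore shrinks as sizeof(T) grows:

#include <iostream>
#include <memory>
#include <cstdint>

int main()
{
    std::cout << std::allocator<int8_t>().max_size()  << "\n"; // max count of 1-byte objects
    std::cout << std::allocator<int64_t>().max_size() << "\n"; // about 1/8 as many 8-byte objects
    return 0;
}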
The previous answers are incomplete, since they only mention the number of bytes, not the number of objects.
The documentation for max_size says the function should return "the maximum theoretically possible value of n, for which the call allocate(n, 0) could succeed," where n is the number of objects. It also says that for some allocator implementations it should return
std::numeric_limits<size_type>::max() / sizeof(value_type)
rather than
std::numeric_limits<size_type>::max()
The STL containers (e.g. std::vector, std::map, or std::list) use max_size to calculate container size in terms of object count, not byte count. Therefore, max_size() should not return the number of bytes available from the operating system, but should use the number of available bytes to calculate the number of objects the allocator can hold.
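For example (a sketch; exact values vary by implementation, since a container may also account for its own bookkeeping), the containers report the same object-count limit through their own max_size():

#include <iostream>
#include <vector>
#include <cstdint>

int main()
{
    std::cout << std::vector<int8_t>().max_size()  << "\n"; // counts objects, not bytes
    std::cout << std::vector<int64_t>().max_size() << "\n"; // smaller: each element is 8 bytes
    return 0;
}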
If you wrote an allocator class for STL containers, you could implement the max_size() function like this to provide an accurate object count instead of over-estimating with std::numeric_limits<size_type>::max():
size_type max_size() const
{
    const unsigned long long bytesAvailable = GetTotalAvailableMemory();
    const unsigned long long maxPossibleObjects = bytesAvailable / sizeof(value_type);
    return maxPossibleObjects;
}
You can implement GetTotalAvailableMemory() like the following functions, depending on your operating system. The Unix version returns the total number of bytes of physical memory on the machine; the Windows version returns the number of bytes of unreserved virtual memory that the calling process may use.
#if defined(unix) || defined(__unix__) || defined(__unix)
#include <unistd.h>

unsigned long long GetTotalAvailableMemory()
{
    const long pageCount = sysconf( _SC_PHYS_PAGES ); // total physical pages
    const long pageSize  = sysconf( _SC_PAGE_SIZE );  // bytes per page
    // Widen before multiplying so the product cannot overflow a long.
    return static_cast<unsigned long long>(pageCount)
         * static_cast<unsigned long long>(pageSize);
}
#endif
#if defined(_WIN32) || defined(_WIN64)
#include <windows.h>

unsigned long long GetTotalAvailableMemory()
{
    MEMORYSTATUSEX status;
    status.dwLength = sizeof( status ); // must be set before the call
    GlobalMemoryStatusEx( &status );
    return status.ullAvailVirtual;      // unreserved virtual memory for this process
}
#endif
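To see how this plugs into a container, here is a minimal sketch of such an allocator (the name BoundedAllocator is made up for illustration, and error handling is kept to the bare minimum):

#include <cstddef>
#include <new>
#include <vector>

unsigned long long GetTotalAvailableMemory(); // one of the definitions above

template <typename T>
struct BoundedAllocator
{
    using value_type = T;
    using size_type  = std::size_t;

    BoundedAllocator() = default;
    template <typename U>
    BoundedAllocator(const BoundedAllocator<U>&) {}

    size_type max_size() const
    {
        // Whole T objects that fit in the currently available memory.
        return static_cast<size_type>(GetTotalAvailableMemory() / sizeof(T));
    }

    T* allocate(size_type n)
    {
        if (n > max_size())
            throw std::bad_alloc();
        return static_cast<T*>(::operator new(n * sizeof(T)));
    }

    void deallocate(T* p, size_type) { ::operator delete(p); }
};

template <typename T, typename U>
bool operator==(const BoundedAllocator<T>&, const BoundedAllocator<U>&) { return true; }

template <typename T, typename U>
bool operator!=(const BoundedAllocator<T>&, const BoundedAllocator<U>&) { return false; }

int main()
{
    std::vector<int, BoundedAllocator<int>> v{1, 2, 3};
    // v.max_size() now reflects available memory, not SIZE_MAX / sizeof(int).
    return 0;
}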