Why did C++ introduce duration_cast instead of using static_cast?

I was looking at some code that used duration_cast. Looking at it, I wondered why a static_cast was not used, since static_cast's purpose in life is to convert between types.

Why did C++ need a new operator to convert between times? Why was static_cast not used?


Maybe I don't appreciate the distinction C++ makes between milliseconds, microseconds, nanoseconds, etc. For some reason, I thought the answer would be obvious or discussed on Stack Overflow, but I have not found it (yet).

asked May 31 '17 by jww

2 Answers

Conversions between duration types are already implicit when there is no risk of losing precision; duration_cast is required only when the conversion may lose precision.

duration_cast is therefore not so much a cast operator as a deliberate, explicit conversion.

static_cast is not suitable because the various duration types are not related by inheritance. They are distinct instantiations of the std::chrono::duration class template which happen to model the same concept.

e.g.:

#include <chrono>

int main()
{
  using namespace std::literals;

  // a has type std::chrono::milliseconds
  auto a = 10ms;

  // converting to seconds truncates (10ms would become 0s),
  // so an explicit duration_cast is required
  auto lossy = std::chrono::duration_cast<std::chrono::seconds>(a);

  // converting to nanoseconds is exact, so no cast is needed
  auto not_lossy = std::chrono::nanoseconds(a);
}
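
In fact, for the lossy direction static_cast does not even compile: the converting constructor of std::chrono::duration is constrained (via SFINAE) to lossless conversions. A quick sketch of both directions, assuming C++14 for the chrono literals:

#include <chrono>

int main()
{
  using namespace std::literals;

  auto a = 10ms;

  // ok: milliseconds convert to nanoseconds without loss, so the
  // implicit converting constructor exists and static_cast uses it
  auto widened = static_cast<std::chrono::nanoseconds>(a);

  // error: the converting constructor of std::chrono::seconds is
  // disabled for truncating integral conversions, so this line
  // would not compile if uncommented
  // auto truncated = static_cast<std::chrono::seconds>(a);
}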
answered by Richard Hodges


I have revisited that question a lot over the years, and I now think that may have been a design mistake on my part.

I am currently experimenting with depending more on explicit conversion syntax for conversions that should not be made implicitly, rather than "named conversion syntax".

For example:

https://howardhinnant.github.io/date/date.html#year

year y = 2017_y;
int iy = int{y};  // instead of iy = y.to_int()
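
The same idea can be sketched with a stripped-down, hypothetical year class (not the real one from the date library): the explicit conversion operator allows int{y} while forbidding a silent implicit conversion.

#include <iostream>

class year
{
    int y_;
public:
    explicit constexpr year(int y) : y_{y} {}
    // explicit: callers must ask for the conversion with int{y}
    explicit constexpr operator int() const { return y_; }
};

int main()
{
    year y{2017};
    // int bad = y;   // error: no implicit conversion to int
    int iy = int{y};  // ok: explicit conversion syntax
    std::cout << iy << '\n';
}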
answered by Howard Hinnant