What is uninitialized data in the torch.empty function?

I was going through a PyTorch tutorial and came across the torch.empty function. It was mentioned that empty can be used for uninitialized data. But when I printed it, I got values. What is the difference between this and torch.rand, which also generates data (I know that rand generates values between 0 and 1)? Below is the code I tried:

import torch

a = torch.empty(3, 4)
print(a)

Output:

tensor([[ 8.4135e-38,  0.0000e+00,  6.2579e-41,  5.4592e-39],
        [-5.6345e-08,  2.5353e+30,  5.0447e-44,  1.7020e-41],
        [ 1.4000e-38,  5.7697e-05,  2.5353e+30,  2.1580e-43]])
b = torch.rand(3, 4)
print(b)

Output:

tensor([[ 0.1514,  0.8406,  0.2708,  0.3422],
        [ 0.7196,  0.6120,  0.4476,  0.6705],
        [ 0.6989,  0.2086,  0.5100,  0.8285]])

Here is the link to the official documentation.

Asked Jul 02 '18 by InAFlash

People also ask

How do you initialize an empty torch tensor?

If you want a tensor with no data in it, you can create a tensor with 0 size: x = torch.empty(0, 3).

How do I know if my torch tensor is empty?

To check whether an allocated tensor has zero elements, use numel(). To check whether a tensor is both defined and non-empty, use defined() and then numel().
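
A minimal sketch of the zero-element check in Python (note that defined() belongs to the C++ ATen API; in Python code you would instead compare the variable against None):

import torch

x = torch.empty(0, 3)   # allocated, but holds zero elements
print(x.numel())        # 0 -> "empty" in the zero-element sense

y = torch.empty(2, 3)   # allocated with 6 uninitialized elements
print(y.numel())        # 6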

What does torch.empty mean?

The function torch.empty() returns a tensor filled with uninitialized data. The shape of the tensor is defined by the variadic size argument.
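
For instance, the size can be passed either as separate integers or as a single sequence; a minimal sketch (both forms produce the same shape):

import torch

a = torch.empty(2, 3)     # variadic form: size as separate ints
b = torch.empty((2, 3))   # sequence form: size as a tuple
print(a.shape, b.shape)   # torch.Size([2, 3]) torch.Size([2, 3])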



1 Answer

When you call torch.empty(), a block of memory is allocated according to the size (shape) of the tensor. By uninitialized data, it is meant that torch.empty() simply returns whatever values happen to be in that memory block, as is. These could be default values, or values left behind by other operations that previously used that part of memory.


Here's a simple illustration:

# a block of memory with the values in it
In [74]: torch.empty(2, 3)
Out[74]: 
tensor([[-1.0049e+08,  4.5688e-41, -9.1450e-38],
        [ 3.0638e-41,  4.4842e-44,  0.0000e+00]])

# the same call again, but note the change in values:
# i.e. different memory addresses than on the previous call were used.
In [75]: torch.empty(2, 3)
Out[75]: 
tensor([[-1.0049e+08,  4.5688e-41, -7.9421e-38],
        [ 3.0638e-41,  4.4842e-44,  0.0000e+00]])
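
One practical consequence: torch.empty() is the right choice when every element will be overwritten anyway, since it skips the cost of initializing the memory. A minimal sketch of that pattern (the buffer name is illustrative):

import torch

# Allocate an uninitialized buffer; its initial contents are arbitrary.
buf = torch.empty(3, 4)

# Overwrite every element before reading it, so the arbitrary initial
# values are never observed.
torch.rand(3, 4, out=buf)   # or e.g. buf.fill_(0.0), buf.normal_()
print(buf)                  # now well-defined values in [0, 1)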
Answered Sep 19 '22 by kmario23