In this PyTorch lesson, we will use the sqrt() method to return the square root of each element in a tensor, along with the related rsqrt() method. PyTorch is an open-source machine learning framework for the Python programming language. Data is stored in a multidimensional array called a tensor, and we must import the torch module to work with tensors.
A tensor is created with the tensor() function.
PyTorch – rsqrt() Syntax
The syntax is as follows:
torch.rsqrt(input)
Where input is a tensor, created with torch.tensor(data) and data is a multidimensional array.
Output
It outputs a tensor containing the reciprocal of the square root of each element of the input.
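For instance, a minimal sketch (the inputs are chosen so the results are exact):
import torch

print(torch.rsqrt(torch.tensor([4., 16., 25.])))
# prints: tensor([0.5000, 0.2500, 0.2000])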
Steps
Import the required modules.
The only Python library required for each of the following examples is torch; make sure it is already installed.
import torch
Create and print a torch tensor.
input_var = torch.randn(8,9)
print("The Tensor's Input is:\n", input_var)
Use torch.rsqrt() to compute the reciprocal of the square root of each element of the input tensor:
torch.rsqrt(input_var)
Here, input_var is the input tensor.
var_recip = torch.rsqrt(input_var)
Show the reciprocal values of the computed tensor:
print("The Reciprocal SQRT Tensor is as follows:\n", var_recip)
Torch.sqrt()
The PyTorch function sqrt() returns the square root of each element in the tensor. It needs just one parameter. The syntax is as follows:
torch.sqrt(tensor_object)
Parameter
The tensor_object parameter in the sqrt() function is a tensor.
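Note that sqrt() also accepts integer tensors: in recent PyTorch versions, an integer input is promoted to the default floating-point dtype, so the result is always a float tensor. A minimal sketch:
import torch

# integer input is promoted; the result dtype is torch.float32
print(torch.sqrt(torch.tensor([4, 9, 16])))
# prints: tensor([2., 3., 4.])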
Example: One-dimensional tensor with five elements
In this example, we’ll make a tensor with one dimension and five elements, and we’ll return the square roots of these five elements in a tensor.
#start by importing the torch module
import torch
#creating a new tensor
varData = torch.tensor([17,39,61,6,15])
#show the real tensor
print("The actual Tensor is: ")
print(varData)
print("The resulting Square Root is: ")
print(torch.sqrt(varData))
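Because the inputs are fixed, the output is deterministic and should look approximately like this:
The actual Tensor is:
tensor([17, 39, 61,  6, 15])
The resulting Square Root is:
tensor([4.1231, 6.2450, 7.8102, 2.4495, 3.8730])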
Example: Two-dimensional tensor with five elements per row
In this example, we will generate a two-dimensional tensor with five elements per row and return the elements’ square root.
#begin by importing the torch module
import torch
#creation of a 2D tensor
varData = torch.tensor([[50,72,26,28,7],[7,8,9,10,11]])
#show the real tensor
print("The Actual Tensor is: ")
print(varData)
print("The resulting Square Root is:")
print(torch.sqrt(varData))
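PyTorch also exposes sqrt() as a tensor method, so torch.sqrt(varData) can equivalently be written as varData.sqrt(); an in-place variant, sqrt_(), exists as well but requires a floating-point tensor. A small sketch:
import torch

varData = torch.tensor([[50, 72, 26, 28, 7], [7, 8, 9, 10, 11]])
print(varData.sqrt())    # same result as torch.sqrt(varData)

t = varData.float()      # sqrt_() needs a floating-point tensor
t.sqrt_()                # in-place: t now holds the square roots
print(t)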
Utilizing a CPU
To execute the sqrt() function on the CPU, we call the cpu() method when building the tensor; this ensures the tensor lives in CPU memory. The syntax is as follows:
torch.tensor(data).cpu()
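For example, a quick check that the tensor really lives on the CPU:
import torch

t = torch.tensor([4., 9., 16.]).cpu()
print(t.device)          # prints: cpu
print(torch.sqrt(t))     # prints: tensor([2., 3., 4.])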
Example: One-dimensional tensor with five elements on the CPU
In this example, we’ll build a one-dimensional tensor with five elements on the CPU, then return the square roots of these five elements.
#commence by importing the torch module
import torch
#creation of a tensor
varData = torch.tensor([17,39,61,6,15]).cpu()
#show the real tensor
print("The Actual Tensor is:")
print(varData)
print("The Square Root is: ")
print(torch.sqrt(varData))
Example: Two-dimensional tensor with five elements on the CPU
In this example, we’ll generate a two-dimensional tensor with five elements in each row on the CPU and return the square root of the elements.
#start by importing the torch module
import torch
#creation of a 2D tensor
varData = torch.tensor([[50,72,26,28,7],[7,8,9,10,11]]).cpu()
#show the actual tensor
print("The real tensor is: ")
print(varData)
print("The resultant square root is:")
print(torch.sqrt(varData))
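The same pattern extends to a GPU: if a CUDA device is available, building the tensor with cuda() instead of cpu() runs sqrt() on the GPU. A sketch, guarded by an availability check:
import torch

if torch.cuda.is_available():
    varData = torch.tensor([[50., 72., 26., 28., 7.]]).cuda()
    print(torch.sqrt(varData))   # computed on the GPU
else:
    print("No CUDA device available; skipping the GPU example.")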
Example: Computing the reciprocal of the square root
This Python application calculates the reciprocal of the square root of input tensors with real and complex values.
# start by importing the needed library
import torch

# defining the input tensor
var_input = torch.tensor([6.2, 8., 9., 9.2, -2.2])
# printing the defined tensor from above
print("The resultant input tensor is:\n", var_input)
# computation of the square root's reciprocal
var_recip = torch.rsqrt(var_input)
# printing the tensor resulting from the computation above
print("The SQRT Tensor Reciprocal is:\n", var_recip)
print("............................")

# definition of a complex input tensor
var_input = torch.tensor([1.2+2j, 3.+4.j, 4.2-3.2j])
# printing the defined tensor from above
print("The resultant input tensor is:\n", var_input)
# computing the square root's reciprocal
var_recip = torch.rsqrt(var_input)
# print the above-computed tensor
print("The SQRT Tensor's Reciprocal is :\n", var_recip)
Keep in mind that an element of the reciprocal square-root tensor is NaN when the corresponding input element is negative; when the input element is zero, the result is inf.
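A quick sketch confirming this behavior:
import torch

print(torch.rsqrt(torch.tensor([0., -1., 4.])))
# prints: tensor([inf, nan, 0.5000])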
Example:
# begin by importing the required library
import torch

# definition of an input tensor
var_input = torch.randn(8,9)
# printing of the defined tensor from above
print("The Input Tensor is :\n", var_input)
# compute the reciprocal of the square root
var_recip = torch.rsqrt(var_input)
# printing the computed tensor from above
print("The SQRT Tensor's Reciprocal is:\n", var_recip)
print("......................................")

# definition of an input tensor that is complex
var_real = torch.randn(8,8)
var_imag = torch.randn(8,8)
var_input = torch.complex(var_real, var_imag)
# printing of the defined tensor shown above
print("The resulting Input Tensor is:\n", var_input)
# computing the square root's reciprocal from above
var_recip = torch.rsqrt(var_input)
# printing the computed tensor as shown above
print("The SQRT Tensor's Reciprocal is:\n", var_recip)
Example: AWS (Adaptive Weight Standardization) overrides the function _load_from_state_dict to recover weight_gamma and weight_beta
def _load_from_state_dict(self, var_state_dict, var_prefix, var_local_metadata,
                          var_strict, var_missing_keys, var_unexpected_keys,
                          error_msgs):
    """Replace the built-in load function.

    If weight_gamma and weight_beta are absent, AWS overrides
    _load_from_state_dict to retrieve them. If weight_gamma and weight_beta
    are found in the checkpoint, this method returns right after
    super()._load_from_state_dict. Otherwise, it calculates the pretrained
    weights' mean and standard deviation and saves those results in
    weight_beta and weight_gamma.
    """
    # Sentinel value: a positive mean after loading means the checkpoint
    # contained weight_gamma.
    self.weight_gamma.data.fill_(-1)
    var_local_missing_keys = []
    super()._load_from_state_dict(var_state_dict, var_prefix,
                                  var_local_metadata, var_strict,
                                  var_local_missing_keys,
                                  var_unexpected_keys, error_msgs)
    if self.weight_gamma.data.mean() > 0:
        # weight_gamma was loaded from the checkpoint; pass the missing
        # keys through unchanged.
        for k in var_local_missing_keys:
            var_missing_keys.append(k)
        return
    # weight_gamma and weight_beta are absent: derive them from the
    # pretrained weights' per-filter mean and standard deviation.
    var_weight = self.weight.data
    var_weight_flat = var_weight.view(var_weight.size(0), -1)
    var_mean = var_weight_flat.mean(dim=1).view(-1, 1, 1, 1)
    var_std = torch.sqrt(var_weight_flat.var(dim=1) + 1e-5).view(-1, 1, 1, 1)
    self.weight_beta.data.copy_(var_mean)
    self.weight_gamma.data.copy_(var_std)
    # Drop the recovered keys from the missing list before reporting.
    var_missing_gamma_beta = [
        k for k in var_local_missing_keys
        if k.endswith('weight_gamma') or k.endswith('weight_beta')
    ]
    for k in var_missing_gamma_beta:
        var_local_missing_keys.remove(k)
    for k in var_local_missing_keys:
        var_missing_keys.append(k)
Example: Forward pass of a BERT-style self-attention layer
def forward(self, invisible_states, attn_mask):
    mxd_query_layer = self.query(invisible_states)
    mxd_key_layer = self.key(invisible_states)
    mxd_value_layer = self.value(invisible_states)

    query_layer = self.transpose_for_scores(mxd_query_layer)
    key_layer = self.transpose_for_scores(mxd_key_layer)
    value_layer = self.transpose_for_scores(mxd_value_layer)

    # Take the dot product between "query" and "key" to obtain the raw
    # attention scores.
    attn_scores = torch.matmul(query_layer, key_layer.transpose(-1, -2))
    attn_scores = attn_scores / math.sqrt(self.attn_head_size)
    # Apply the attention mask, which BertModel's forward() function has
    # already computed for all layers.
    attn_scores = attn_scores + attn_mask

    # Normalize the attention scores to probabilities.
    attn_probs = nn.Softmax(dim=-1)(attn_scores)

    # This is actually dropping out entire tokens to attend to, which may
    # appear slightly odd but is derived from the original Transformer paper.
    attn_probs = self.dropout(attn_probs)

    contxt_layer = torch.matmul(attn_probs, value_layer)
    contxt_layer = contxt_layer.permute(0, 2, 1, 3).contiguous()
    new_contxt_layer_shape = contxt_layer.size()[:-2] + (self.all_head_size,)
    contxt_layer = contxt_layer.view(*new_contxt_layer_shape)
    return contxt_layer
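The snippet above assumes a transpose_for_scores() helper that splits the hidden dimension into attention heads. It is not shown in the original, so here is a minimal sketch of the usual BERT-style implementation; the attribute name num_attn_heads is an assumption, chosen to match attn_head_size and all_head_size:
def transpose_for_scores(self, x):
    # reshape (batch, seq_len, all_head_size)
    # into (batch, seq_len, num_heads, head_size)
    # note: num_attn_heads is an assumed attribute name
    new_shape = x.size()[:-1] + (self.num_attn_heads, self.attn_head_size)
    x = x.view(*new_shape)
    # move the head dimension forward: (batch, num_heads, seq_len, head_size)
    return x.permute(0, 2, 1, 3)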
Conclusion
This article covered the PyTorch sqrt() method, which returns the square root of every element of a tensor, and the torch.rsqrt() method, which computes the reciprocal of the square root of each element of the input tensor. Both accept real- and complex-valued inputs.
If an element of the input tensor is negative, the matching element of the output tensor is NaN; for rsqrt(), a zero input yields inf. To demonstrate the sqrt() function, we explored several examples using tensors of various dimensions.