What is a data tensor?

We have seen how to create a tensor, and we have also seen the difference between a scalar, a vector, and a tensor.

We have also seen what a matrix is.

In that sense, a vector or a matrix is also a tensor.

In practice, though, the term tensor usually refers to a multi-dimensional array.

First, we can create a tensor with TensorFlow.

Second, we can extract information about a tensor's data quite easily.

Let’s see a code sample.

import tensorflow as tf
tf

# output:
<module 'tensorflow' from '/usr/local/lib/python3.7/dist-packages/tensorflow/__init__.py'>

But above all, we should know the tensor vocabulary first.

The shape of a tensor refers to the length, that is, the number of elements, along each of its dimensions.

How do we know the number of dimensions?

Simple.

A scalar has 0 dimensions. A vector, on the other hand, has 1 dimension.

We have discussed it in previous sections.

In tensor vocabulary, the word “Rank” refers to the number of dimensions a tensor has.

For example, a scalar has rank 0.

When we say “Axis” or “Dimension”, we refer to one particular dimension of a tensor.

Finally, there is the idea of Size.

The Size of a tensor is the total number of items it contains.
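We can check this vocabulary directly in code. Here is a small sketch (the variable names are my own) that creates a scalar, a vector, and a matrix and reads off their rank, shape, and size:

```python
import tensorflow as tf

scalar = tf.constant(7)                      # rank 0: no dimensions
vector = tf.constant([1., 2., 3.])           # rank 1: one dimension
matrix = tf.constant([[1., 2.], [3., 4.]])   # rank 2: two dimensions

print(scalar.ndim, vector.ndim, matrix.ndim)  # ranks: 0 1 2
print(vector.shape)                           # shape: (3,)
print(tf.size(matrix).numpy())                # size: 4 items in total
```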

TensorFlow attributes

We have discussed TensorFlow attributes before. Using those attributes, we can extract information about a tensor's data.

Let’s see the code. We’re going to create a five-dimensional tensor.

# Create a rank 5 tensor (5 dimensions)
rank_five_tensor = tf.zeros([2, 3, 4, 5, 6])
rank_five_tensor

The output is quite big, but we can still see a chunk of it.

You can find the rest in this GitHub repository.

<tf.Tensor: shape=(2, 3, 4, 5, 6), dtype=float32, numpy=
array([[[[[0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.]],

         [[0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.]],

....
.....
[[0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.]],

         [[0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.],
          [0., 0., 0., 0., 0., 0.]]]]], dtype=float32)>

# output truncated

Now we can see the main information in one go.

rank_five_tensor.shape, rank_five_tensor.ndim, tf.size(rank_five_tensor)

# output:
(TensorShape([2, 3, 4, 5, 6]),
 5,
 <tf.Tensor: shape=(), dtype=int32, numpy=720>)

We get the shape and the number of dimensions, which is 5. Finally, we get the total number of elements, which is 720.

Next, we can print the main attributes in one place.

# Getting various attributes of the tensor
print("Datatype of each element:", rank_five_tensor.dtype)
print("Number of dimensions which is known as rank:", rank_five_tensor.ndim)
print("The Shape of tensor:", rank_five_tensor.shape)
print("Elements along axis 0 of the tensor:", rank_five_tensor.shape[0])
print("Elements along last axis of the tensor:", rank_five_tensor.shape[-1])
print("Total number of elements (2*3*4*5*6):", tf.size(rank_five_tensor).numpy()) 

# output:
Datatype of each element: <dtype: 'float32'>
Number of dimensions which is known as rank: 5
The Shape of tensor: (2, 3, 4, 5, 6)
Elements along axis 0 of the tensor: 2
Elements along last axis of the tensor: 6
Total number of elements (2*3*4*5*6): 720
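As a sanity check (this snippet is my own addition, not part of the original code), the size reported by tf.size() should equal the product of the lengths along every axis:

```python
import math
import tensorflow as tf

rank_five_tensor = tf.zeros([2, 3, 4, 5, 6])

# The size is the product of the shape: 2 * 3 * 4 * 5 * 6
expected_size = math.prod(rank_five_tensor.shape.as_list())
print(expected_size, int(tf.size(rank_five_tensor).numpy()))  # 720 720
```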

Certainly we can do more.

Let’s see the code.

# Get the first 1 item of each dimension
rank_five_tensor[:1, :1, :1, :1, :1]

# output:
<tf.Tensor: shape=(1, 1, 1, 1, 1), dtype=float32, numpy=array([[[[[0.]]]]], dtype=float32)>

# Keep every element of the first axis (and of the unspecified last axis),
# but only the first element of axes 1, 2, and 3
rank_five_tensor[:, :1, :1, :1]

# output:
<tf.Tensor: shape=(2, 1, 1, 1, 6), dtype=float32, numpy=
array([[[[[0., 0., 0., 0., 0., 0.]]]],



       [[[[0., 0., 0., 0., 0., 0.]]]]], dtype=float32)>
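The slice above leaves several size-1 axes behind. As a side note (tf.squeeze is a standard TensorFlow op, though it does not appear in the original code), we can drop those axes and get back a plain rank-2 tensor:

```python
import tensorflow as tf

rank_five_tensor = tf.zeros([2, 3, 4, 5, 6])
sliced = rank_five_tensor[:, :1, :1, :1]   # shape (2, 1, 1, 1, 6)

# tf.squeeze removes every axis of length 1
squeezed = tf.squeeze(sliced)
print(squeezed.shape)  # (2, 6)
```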

We will see later what we can do with these tensors.

So stay tuned.

What Next?

Books at Leanpub

Books in Apress

My books at Amazon

GitHub repository

TensorFlow, Machine Learning, AI and Data Science

Flutter, Dart and Algorithm

Twitter
