Image Augmentation with Keras Preprocessing Layers and tf.image


Last Updated on August 6, 2022

When you work on a machine learning problem related to images, not only do you need to collect some images as training data, but you also need to employ augmentation to create variations in the image. This is especially true for more complex object recognition problems.

There are many ways to do image augmentation. You may use some external libraries or write your own functions for that. There are some modules in TensorFlow and Keras for augmentation, too.

In this post, you will discover how you can use the Keras preprocessing layers as well as the tf.image module in TensorFlow for image augmentation.

After reading this post, you will know:

  • What the Keras preprocessing layers are, and how to use them
  • What functions the tf.image module provides for image augmentation
  • How to use augmentation together with a tf.data dataset

Let's get started.

Image augmentation with Keras preprocessing layers and tf.image.
Photo by Steven Kamenar. Some rights reserved.


This article is divided into five sections; they are:

  • Getting Images
  • Visualizing the Images
  • Keras Preprocessing Layers
  • Using tf.image API for Augmentation
  • Using Preprocessing Layers in Neural Networks

Getting Images

Before you see how you can do augmentation, you need to get the images. Ultimately, you need the images to be represented as arrays, for example, as H×W×3 arrays of 8-bit integers for the RGB pixel values. There are many ways to get the images. Some can be downloaded as a ZIP file. If you're using TensorFlow, you can get some image datasets from the tensorflow_datasets library.

In this tutorial, you will use the citrus leaves images, which is a small dataset of less than 100MB. It can be downloaded from tensorflow_datasets as follows:

Running this code the first time will download the image dataset onto your computer.

This function returns the images as a tf.data dataset object, together with the metadata. This is a classification dataset. You can print the training labels with the following:


If you run this code again at a later time, it will reuse the downloaded images. But another way to load the downloaded images into a tf.data dataset is to use the image_dataset_from_directory() function.

As you can see from the screen output above, the dataset is downloaded into the directory ~/tensorflow_datasets.

Inside it, the directories are the labels, and the images are files stored under their corresponding directory. You can let the function read the directory recursively into a dataset:
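A sketch of that step; to keep the snippet self-contained, a temporary directory with dummy images and placeholder class names stands in for the downloaded dataset directory:

```python
import os
import tempfile

import tensorflow as tf

# Stand-in for the downloaded dataset: a temporary directory with one
# subdirectory per class, each holding a dummy image
root = tempfile.mkdtemp()
for label in ("healthy", "unhealthy"):
    os.makedirs(os.path.join(root, label))
    png = tf.io.encode_png(tf.zeros([256, 256, 3], dtype=tf.uint8))
    tf.io.write_file(os.path.join(root, label, "leaf.png"), png)

# Subdirectory names become class labels; images are resized to 256x256
# and batched in groups of 32 on load
ds = tf.keras.utils.image_dataset_from_directory(
    root,  # in practice: the directory under ~/tensorflow_datasets
    image_size=(256, 256),
    batch_size=32,
)
print(ds.class_names)
```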

You may want to set batch_size=None if you do not want the dataset to be batched. Usually, you want the dataset to be batched for training a neural network model.

Visualizing the Images

It is important to visualize the augmentation result, so you can verify that the result is what you want it to be. You can use matplotlib for this.

In matplotlib, you have the imshow() function to display an image. However, for the image to be displayed correctly, it should be presented as an array of 8-bit unsigned integers (uint8).

Given that you have a dataset created using image_dataset_from_directory(), you can get the first batch (of 32 images) and display a few of them using imshow(), as follows:
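A sketch of the display code; a synthetic batched dataset and placeholder class names stand in for the real ds and ds.class_names from image_dataset_from_directory():

```python
import matplotlib.pyplot as plt
import tensorflow as tf

# Stand-ins for the real dataset and its class names
class_names = ["healthy", "unhealthy"]
images = tf.random.uniform([32, 256, 256, 3], maxval=255)
labels = tf.random.uniform([32], maxval=2, dtype=tf.int64)
ds = tf.data.Dataset.from_tensor_slices((images, labels)).batch(32)

# Show nine images from the first batch in a 3x3 grid, titled by label
fig, ax = plt.subplots(3, 3, figsize=(6, 6))
for batch_images, batch_labels in ds.take(1):
    for i in range(9):
        ax[i // 3][i % 3].imshow(batch_images[i].numpy().astype("uint8"))
        ax[i // 3][i % 3].set_title(class_names[int(batch_labels[i])])
plt.show()
```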

Here, you see a display of nine images in a grid, labeled with their corresponding classification labels using ds.class_names. The images should be converted to a NumPy array in uint8 for display.

The complete code, from loading the images to displaying them, simply combines the steps above.

Note that if you're using tensorflow_datasets to get the images, the samples are presented as a dictionary instead of a tuple of (image, label). You should change your code slightly to the following:
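With tensorflow_datasets, each sample is a dictionary with "image" and "label" keys, so the loop indexes by key rather than unpacking a tuple. A synthetic dict-style dataset stands in for the real one here:

```python
import matplotlib.pyplot as plt
import tensorflow as tf

# Stand-in dataset in the same dict format tensorflow_datasets uses
ds = tf.data.Dataset.from_tensor_slices({
    "image": tf.random.uniform([9, 256, 256, 3], maxval=255, dtype=tf.int32),
    "label": tf.zeros([9], dtype=tf.int64),
}).batch(9)

fig, ax = plt.subplots(3, 3, figsize=(6, 6))
for sample in ds.take(1):
    # index the dict by key instead of unpacking an (image, label) tuple
    images, labels = sample["image"], sample["label"]
    for i in range(9):
        ax[i // 3][i % 3].imshow(images[i].numpy().astype("uint8"))
        ax[i // 3][i % 3].set_title(int(labels[i]))
plt.show()
```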

For the rest of this post, assume the dataset is created using image_dataset_from_directory(). You may need to tweak your code slightly if your dataset is created differently.

Keras Preprocessing Layers

Keras comes with many neural network layers, such as convolution layers, that you need to train. There are also layers with no parameters to train, such as flatten layers, which convert an array such as an image into a vector.

The preprocessing layers in Keras are specifically designed to be used in the early stages of a neural network. You can use them for image preprocessing, such as to resize or rotate the image or adjust the brightness and contrast. While the preprocessing layers are supposed to be part of a larger neural network, you can also use them as functions. Below is how you can use the resizing layer as a function to transform some images and display them side by side with the originals:
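A sketch of that idea, with synthetic 256×256 images standing in for a batch from the dataset:

```python
import matplotlib.pyplot as plt
import tensorflow as tf

# Synthetic images stand in for a batch from the dataset
images = tf.random.uniform([4, 256, 256, 3], maxval=255)

# A preprocessing layer can be called directly, like a function
resize = tf.keras.layers.Resizing(256, 128)  # target height 256, width 128
resized = resize(images)

# Originals on the top row, resized versions below
fig, ax = plt.subplots(2, 4, figsize=(8, 4))
for i in range(4):
    ax[0][i].imshow(images[i].numpy().astype("uint8"))
    ax[1][i].imshow(resized[i].numpy().astype("uint8"))
plt.show()
```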

The images are 256×256 pixels, and the resizing layer turns them into 256×128 pixels.

Since the resizing layer is a function, you can chain it to the dataset itself. For example:
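A sketch of the chaining, with a synthetic (image, label) dataset standing in for the one loaded from disk:

```python
import tensorflow as tf

# Stand-in (image, label) dataset
images = tf.random.uniform([8, 256, 256, 3], maxval=255)
labels = tf.zeros([8], dtype=tf.int64)
ds = tf.data.Dataset.from_tensor_slices((images, labels)).batch(4)

resize = tf.keras.layers.Resizing(256, 128)

def preprocess(image, label):
    # transform only the image; pass the label through unchanged
    return resize(image), label

resized_ds = ds.map(preprocess)
for image, label in resized_ds.take(1):
    print(image.shape)
```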

The dataset ds has samples in the form of (image, label). Hence you create a function that takes in such a tuple and preprocesses the image with the resizing layer. You then assign this function as an argument to map() on the dataset. When you draw a sample from the new dataset created with map(), the image will be a transformed one.

There are more preprocessing layers available. Some are demonstrated below.

As you saw above, you can resize the image. You can also randomly enlarge or shrink the height or width of an image. Similarly, you can zoom in or zoom out on an image. Below is an example of manipulating the image size in various ways for a maximum of 30% increase or decrease:
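A sketch with synthetic images; each random layer draws a fresh factor of up to 30% per call:

```python
import tensorflow as tf

# Synthetic images stand in for a dataset batch
images = tf.random.uniform([4, 256, 256, 3], maxval=255)

taller = tf.keras.layers.RandomHeight(0.3)(images)  # height scaled by 0.7-1.3
wider = tf.keras.layers.RandomWidth(0.3)(images)    # width scaled by 0.7-1.3
zoomed = tf.keras.layers.RandomZoom(0.3)(images)    # zoom in/out, shape kept
print(taller.shape, wider.shape, zoomed.shape)
```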


While you specified a fixed size in resize, you get a random amount of manipulation in the other augmentations.

You can also do flipping, rotation, cropping, and geometric translation using preprocessing layers:
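A sketch of these geometric augmentations on synthetic images:

```python
import tensorflow as tf

# Synthetic images stand in for a dataset batch
images = tf.random.uniform([4, 256, 256, 3], maxval=255)

flipped = tf.keras.layers.RandomFlip("horizontal_and_vertical")(images)
rotated = tf.keras.layers.RandomRotation(0.2)(images)   # up to 0.2 x 360 degrees
cropped = tf.keras.layers.RandomCrop(224, 224)(images)  # random 224x224 window
shifted = tf.keras.layers.RandomTranslation(0.2, 0.2)(images)  # shift up to 20%
print(cropped.shape)
```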


And finally, you can do augmentations on color adjustments as well:
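A sketch of color augmentations on synthetic images; note that RandomBrightness requires TensorFlow 2.9 or later:

```python
import tensorflow as tf

# Synthetic images stand in for a dataset batch
images = tf.random.uniform([4, 256, 256, 3], maxval=255)

contrasted = tf.keras.layers.RandomContrast(0.5)(images)
# RandomBrightness shifts pixel values by up to 50% of the value range
brightened = tf.keras.layers.RandomBrightness(0.5, value_range=(0, 255))(images)
print(contrasted.shape, brightened.shape)
```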


For completeness, the code to display the results of the various augmentations simply combines the snippets above.

Finally, it is important to point out that most neural network models can work better if the input images are scaled. While we usually use 8-bit unsigned integers for the pixel values in an image (e.g., for display using imshow() as above), a neural network prefers the pixel values to be between 0 and 1, or between -1 and +1. This can be done with preprocessing layers, too. Below is how you can update one of the examples above to add the scaling layer into the augmentation:
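A sketch of adding the Rescaling layer after an augmentation, on synthetic images:

```python
import tensorflow as tf

# Synthetic uint8-range images stand in for a dataset batch
images = tf.random.uniform([4, 256, 256, 3], maxval=255)

# Augment first, then scale pixel values from [0, 255] down to [0, 1];
# use Rescaling(1/127.5, offset=-1) for a [-1, +1] range instead
flip = tf.keras.layers.RandomFlip("horizontal")
rescale = tf.keras.layers.Rescaling(1 / 255.0)
out = rescale(flip(images))
print(float(tf.reduce_min(out)), float(tf.reduce_max(out)))
```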

Using tf.image API for Augmentation

Besides the preprocessing layers, the tf.image module also provides some functions for augmentation. Unlike the preprocessing layers, these functions are intended to be used in a user-defined function and assigned to a dataset using map(), as seen above.

The functions provided by tf.image are not duplicates of the preprocessing layers, although there is some overlap. Below is an example of using the tf.image functions to resize and crop images:
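A sketch of resizing and cropping with tf.image, on synthetic images:

```python
import tensorflow as tf

# Synthetic images stand in for a dataset batch
images = tf.random.uniform([4, 256, 256, 3], maxval=255)

resized = tf.image.resize(images, [256, 128])
# crop_to_bounding_box() takes pixel offsets and sizes...
boxed = tf.image.crop_to_bounding_box(images, 32, 32, 128, 128)
# ...while central_crop() takes the fraction of the image to keep
centered = tf.image.central_crop(images, 0.5)
print(resized.shape, boxed.shape, centered.shape)
```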


While the display of images matches what you might expect from the code, the use of tf.image functions is quite different from that of the preprocessing layers. Every tf.image function is different. Therefore, you can see that the crop_to_bounding_box() function takes pixel coordinates, but the central_crop() function takes a fraction as its argument.

These functions also differ in the way randomness is handled. Some of these functions do not assume random behavior. Therefore, for a random resize, the exact output size should be generated using a random number generator separately before calling the resize function. Some other functions, such as stateless_random_crop(), can do augmentation randomly, but a pair of random seeds in int32 needs to be specified explicitly.
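A sketch contrasting the two styles on a synthetic image:

```python
import tensorflow as tf

image = tf.random.uniform([256, 256, 3], maxval=255)

# resize() itself is deterministic: draw the output size separately
height = tf.random.uniform([], minval=180, maxval=333, dtype=tf.int32)
resized = tf.image.resize(image, [height, 256])

# stateless functions are random but need an explicit pair of int32 seeds
cropped = tf.image.stateless_random_crop(image, size=[224, 224, 3], seed=(1, 2))
print(resized.shape, cropped.shape)
```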

To continue the example, there are functions for flipping an image and extracting the Sobel edges:
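A sketch with synthetic images:

```python
import tensorflow as tf

# Synthetic float images stand in for a dataset batch
images = tf.random.uniform([4, 256, 256, 3], maxval=255)

flipped_lr = tf.image.flip_left_right(images)
flipped_ud = tf.image.flip_up_down(images)

# sobel_edges() expects a float batch and appends an axis of size 2
# holding the vertical and horizontal edge maps
edges = tf.image.sobel_edges(images)
print(edges.shape)
```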


And the following are the functions to manipulate the brightness, contrast, and hue:
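A sketch of the color functions; float images in the [0, 1] range are used here, so deltas are fractions rather than pixel counts:

```python
import tensorflow as tf

# Synthetic float images in [0, 1] stand in for a dataset batch
images = tf.random.uniform([4, 256, 256, 3])

brighter = tf.image.adjust_brightness(images, delta=0.2)
contrasted = tf.image.adjust_contrast(images, contrast_factor=2.0)
hue_shifted = tf.image.adjust_hue(images, delta=0.2)
# the stateless random variants again take an explicit seed pair
rand_bright = tf.image.stateless_random_brightness(images, 0.2, seed=(1, 2))
print(hue_shifted.shape)
```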


The complete code to display all of the above similarly combines the snippets in this section.

These augmentation functions should be enough for most uses. But if you have some specific ideas for augmentation, you would probably need a better image processing library. OpenCV and Pillow are common but powerful libraries that allow you to transform images better.

Using Preprocessing Layers in Neural Networks

You used the Keras preprocessing layers as functions in the examples above. But they can also be used as layers in a neural network. It is trivial to use. Below is an example of how you can incorporate a preprocessing layer into a classification network and train it using a dataset:
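A sketch of such a network; the architecture and hyperparameters are illustrative, and a small synthetic dataset stands in for the citrus leaves images so the snippet runs end to end:

```python
import tensorflow as tf

num_classes = 4  # assumption: four leaf categories

# Augmentation and scaling layers sit at the front of the network;
# the random layers are active only during training
model = tf.keras.Sequential([
    tf.keras.Input(shape=(256, 256, 3)),
    tf.keras.layers.RandomFlip("horizontal_and_vertical"),
    tf.keras.layers.RandomRotation(0.2),
    tf.keras.layers.Rescaling(1 / 255.0),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Stand-in dataset; cache() and prefetch() let the pipeline prepare
# batches asynchronously while the network trains
images = tf.random.uniform([8, 256, 256, 3], maxval=255)
labels = tf.random.uniform([8], maxval=num_classes, dtype=tf.int64)
ds = tf.data.Dataset.from_tensor_slices((images, labels)).batch(4)
history = model.fit(ds.cache().prefetch(tf.data.AUTOTUNE), epochs=1, verbose=0)
print(history.history["loss"])
```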


In the code above, you created the dataset with cache() and prefetch(). This is a performance technique that allows the dataset to prepare data asynchronously while the neural network is trained. This would be significant if the dataset has other augmentations assigned using the map() function.

You will see some improvement in accuracy if you remove the RandomFlip and RandomRotation layers, because you make the problem easier. However, as you want the network to predict well across a wide variation of image quality and properties, using augmentation can help your resulting network become more robust.

Further Reading

The TensorFlow documentation on the Keras preprocessing layers and the tf.image module is related to the examples above.


Summary

In this post, you have seen how you can use the tf.data dataset with image augmentation functions from Keras and TensorFlow.

Specifically, you learned:

  • How to use the preprocessing layers from Keras, both as a function and as part of a neural network
  • How to create your own image augmentation function and apply it to the dataset using the map() function
  • How to use the functions provided by the tf.image module for image augmentation
