Overview

Visualizing The Cost Function

Let’s take a look at how the various optimization algorithms stack up against each other. To do that, we’ll look at the visualization of the losses we’ve logged. As the plot clearly shows, the Adam optimizer starts converging faster than any of the others, but in the end LBFGS is the clear winner if minimizing the loss is the comparison metric.

Adagrad runs fairly smoothly but takes much longer to converge. RMSProp is the most unstable of the four optimizers, as its plot clearly shows.
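For reference, here’s a minimal sketch of how the loss for each run might be logged so the curves line up in the wandb UI. The project name, run names, learning rates, and the placeholder MSE loss are assumptions for illustration; in the real experiment the loss is the Gatys style + content loss (LBFGS is shown separately later, since it needs a closure).

```python
import torch
import wandb

def run_experiment(opt_name, make_optimizer, steps=50):
    # One wandb run per optimizer so the loss curves can be overlaid and compared.
    run = wandb.init(project="style-transfer-comparison", name=opt_name)
    image = torch.rand(1, 3, 128, 128, requires_grad=True)   # the image being optimized
    target = torch.rand(1, 3, 128, 128)                      # stand-in for the style/content target

    optimizer = make_optimizer([image])
    for step in range(steps):
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(image, target)   # placeholder for the NST loss
        loss.backward()
        optimizer.step()
        wandb.log({"loss": loss.item()}, step=step)

    run.finish()

run_experiment("adam", lambda params: torch.optim.Adam(params, lr=0.01))
run_experiment("adagrad", lambda params: torch.optim.Adagrad(params, lr=0.1))
run_experiment("rmsprop", lambda params: torch.optim.RMSprop(params, lr=0.01))
```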

Deduction If minimizing the loss function has the highest priority, the best optimizer is LBFGS. We deduced this simply by comparing the loss plots generated by wandb.

But in style transfer, the loss metric does not capture the whole picture. Remember that we’re building a style transfer app for production, and choosing one technique over another is fairly subjective because “style” itself is subjective. So let’s now compare the results. We’ve logged 50 outputs for each execution.
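For context, intermediate outputs can be logged as images so that each run ends up with a comparable sequence keyed by step. A minimal sketch follows; the key name "output", the run name, and the logging interval are assumptions, and the optimizer step itself is elided.

```python
import torch
import wandb
import torchvision.transforms.functional as TF

run = wandb.init(project="style-transfer-comparison", name="adam-outputs")

image = torch.rand(1, 3, 128, 128, requires_grad=True)  # stand-in for the image being stylized

for step in range(50):
    # ... the optimizer step that updates `image` would go here ...
    pil_img = TF.to_pil_image(image.detach().squeeze(0).clamp(0, 1))
    # Logging at the same step index in every run lets wandb line the outputs up side by side.
    wandb.log({"output": wandb.Image(pil_img, caption=f"step {step}")}, step=step)

run.finish()
```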

Let’s compare them.

Visualizing Results at Index 15

Wandb allows us to generate and save reports by choosing and comparing specific runs. I’ve saved a report that compares the outputs at various indices. Let’s have a look at index 15.

Looking at the results, we can easily conclude that the RMSprop optimizer is the outlier: its output has a lot of distortion and no visible signs of style being transferred. This could be fixed by tuning the hyper-parameters, but for the sake of experimentation we’ll hold the parameters constant. The image also looks dull and faded; again, this is purely subjective, and it might be considered a “whitened” version of the image, so we could use this optimizer when trying to apply a whitening filter. Adagrad’s output has sharp contrast, but because its convergence is slow, the style is not yet prominent compared with the outputs of LBFGS and Adam.

Deduction At this index, RMSProp behaves more like a whitening filter than a style-transfer optimizer, Adagrad lags behind because of its slow convergence, and LBFGS and Adam show the most prominent style.

Visualizing Results at Index 25

The LBFGS optimizer has achieved the highest level of style transfer, closely followed by Adam. RMSProp again seems to be the obvious outlier, with its white-filter touch, and Adagrad has the least prominent style features.

Deduction If speed of execution (number of epochs) is the concern, we should go with LBFGS as our optimizer of choice.
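If you swap in LBFGS yourself, note that PyTorch’s torch.optim.LBFGS may re-evaluate the objective several times per step, so it needs a closure. Here’s a minimal sketch of that pattern; the MSE loss is a stand-in for the real style + content loss, and the tensors are placeholders.

```python
import torch

image = torch.rand(1, 3, 128, 128, requires_grad=True)   # the image being optimized
target = torch.rand(1, 3, 128, 128)                       # stand-in for the style/content targets
optimizer = torch.optim.LBFGS([image])

for step in range(20):
    def closure():
        # LBFGS calls this closure to recompute the loss and gradients as needed.
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(image, target)  # placeholder for the NST loss
        loss.backward()
        return loss

    loss = optimizer.step(closure)
    # wandb.log({"loss": loss.item()}, step=step)  # logged the same way as the other runs
```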

Visualizing the Final Results

Let’s have a look at the final output of our algorithm.

Final Deduction

Conclusion

Now we have sufficient insights to choose our optimizer accordingly when building the application. We could even offer multiple forms of style transfer, since we got distinct yet consistent results across all four experiments.

Finally, we can easily tune this algorithm by changing even one of its many parameters. I’d recommend that you try changing the hyper-parameters and compare the executions at various levels. Wandb makes it really easy to log and compare the results, so go ahead and experiment!
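If you do, one convenient pattern is to record the hyper-parameters in wandb.config when the run starts, so every run in the comparison table shows exactly which settings produced it. The parameter names and values below are assumptions, not the exact ones used in this post.

```python
import wandb

# Store the hyper-parameters with the run so wandb can group, filter, and compare runs by them.
config = {
    "optimizer": "lbfgs",
    "steps": 300,
    "style_weight": 1e6,
    "content_weight": 1.0,
    "image_size": 512,
}

run = wandb.init(project="style-transfer-comparison", config=config)

# Inside the training code, read the values back from wandb.config:
steps = wandb.config.steps
style_weight = wandb.config.style_weight

run.finish()
```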

Exploring Neural Style Transfer Paper With W&B

In this tutorial, we’ll go through the neural style transfer algorithm by Gatys et al., implement it, and track it using the W&B library. Let’s assume that we’re building a style transfer app for production. We’ll need to compare the results generated by changing various parameters. This requires subjective comparison, because we cannot use an accuracy metric: no style transfer result is more “accurate” than another. So we’ll need to choose the parameters according to our preference, and that calls for side-by-side comparison, which can easily be done using the wandb library.
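As a quick reminder of what we’re implementing, the core of Gatys’ method is a content loss on feature activations plus a style loss on their Gram matrices, summed over chosen VGG layers. Here’s a minimal sketch of those two pieces in PyTorch; the normalization and the choice of layers and weights are assumptions left to the full implementation.

```python
import torch
import torch.nn.functional as F

def gram_matrix(features):
    # features: (batch, channels, height, width) activations from a VGG layer.
    b, c, h, w = features.shape
    flat = features.view(b, c, h * w)
    # The Gram matrix captures which feature channels co-activate, i.e. "style".
    return flat @ flat.transpose(1, 2) / (c * h * w)

def content_loss(generated_feats, content_feats):
    # Content loss: match the raw activations of a chosen layer.
    return F.mse_loss(generated_feats, content_feats)

def style_loss(generated_feats, style_feats):
    # Style loss: match Gram matrices instead of raw activations.
    return F.mse_loss(gram_matrix(generated_feats), gram_matrix(style_feats))

# The total objective is a weighted sum over the chosen layers:
# total = content_weight * content_loss + style_weight * sum(style_losses)
```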

Reproducing This Analysis


We encourage you to read more about style transfer and play with the code to see if you can achieve interesting results using style transfer.