Deepdreaming new trainees: Still fuzzy blobs

Follow-up to my previous post about training a new neural network to deepdream about new topics.

I let a larger Art History training set run, in hopes that the expanded dataset would yield a more representational result: 2.3 million images, including photometric distortions, 26 times larger than the collection in my previous post. Again, the task was classifying which artist painted which painting. I only trained for 7 epochs (36 hours) before my spouse said enough was enough with the cloud computing bill and forced me to stop the instances. The resulting trainee seems to produce more complex organic forms when deepdreamed, but still nowhere near as sophisticated as bvlc_googLeNet (which was stopped after 60 epochs).

Here are the resulting paintings/dreams, generated with different settings but using the same chromatic gradient image as a guide. The first uses a lower-resolution version of the guide image, fewer octaves, and only 10 iterations.

deepdream 2 art history

The second used 10 octaves with 40 iterations. You can see it really loves the skin color.

deepdream 2 art history
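For readers wondering what "octaves" and "iterations" control: deep dream runs gradient ascent on the image at several scales, carrying the dreamed detail from each smaller scale up to the next larger one. Here's a minimal NumPy sketch of that loop, not my actual pipeline: the `grad_step` function is a placeholder standing in for the network's forward/backward pass, and the crude strided resizes are mine.

```python
import numpy as np

def downscale(img):
    """Crude 2x downscale by striding (stand-in for a proper resize)."""
    return img[::2, ::2]

def upscale(img, shape):
    """Crude nearest-neighbour upscale to a target (height, width)."""
    ry = shape[0] // img.shape[0] + 1
    rx = shape[1] // img.shape[1] + 1
    big = np.repeat(np.repeat(img, ry, axis=0), rx, axis=1)
    return big[:shape[0], :shape[1]]

def grad_step(img):
    """Placeholder gradient: real deep dream backprops a chosen layer's
    activations through the net; here we just amplify deviation from the
    mean so the sketch is self-contained."""
    g = img - img.mean()
    return g / (np.abs(g).mean() + 1e-8)

def deepdream(image, n_octaves=4, iterations=10, step=1.5):
    # Build the octave pyramid, full size first, smallest last.
    octaves = [image.astype(np.float32)]
    for _ in range(n_octaves - 1):
        octaves.append(downscale(octaves[-1]))

    detail = np.zeros_like(octaves[-1])
    for octave in reversed(octaves):       # dream on the smallest octave first
        detail = upscale(detail, octave.shape)
        img = octave + detail              # carry dreamed detail upward
        for _ in range(iterations):        # gradient-ascent iterations per octave
            img += step * grad_step(img)
        detail = img - octave              # keep only the detail we added
    return img
```

More octaves means the dream starts from a much smaller image (each octave here halves the size), and more iterations per octave means stronger hallucinated structure at every scale, which is why the two settings trade off so differently in the images above.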

As I saw in some of the super duper high resolution (128 megapixel) puppyslug murals generated by David A Lobser, higher resolution isn't necessarily more interesting. I find the good stuff occurs at lower resolutions, perhaps in an image roughly the size of one of the training images. That's not a coincidence. Unfortunately, it's not so useful for people who want to make large murals.

The sweet spot I seek is the one where the deconv deep visualization shows activation images that resemble a scrambled version of the training images, because my theory is that this leads to a more representational synthesis in the deep dream. I continue on my weekends.

Anyway, more on kitsch. I'm not necessarily saying that kitsch is bad. I have a lava lamp in my house. What I think would be bad about deep learning + art is having a very promising scientific phenomenon trivialized and frozen in public memory as software that performs one single task. I think it has more potential than that, and I thank all of you who got in touch with me to say you agree.

deepdream training error graph