I have a naive understanding of things so far. For example, suppose we’re talking about classification problems. A typical SVM turns the classification problem into a search for a linear boundary between the data points (image from wiki): But usually the boundary is not linear, which is why kernel methods (kernel SVM, kernel logistic regression) are introduced, in […]
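To make the kernel idea concrete, here is a minimal sketch of a kernel perceptron (a simpler relative of kernel SVM) on XOR-style toy data, which no straight line can separate. The data and function names are hypothetical, not from the post; the RBF kernel stands in for the implicit non-linear feature map.

```python
import math

# Hypothetical XOR toy data: not linearly separable in the plane.
X = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
y = [-1, 1, 1, -1]

def rbf(a, b, gamma=1.0):
    """Gaussian (RBF) kernel k(a, b) = exp(-gamma * ||a - b||^2)."""
    d2 = sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return math.exp(-gamma * d2)

def train_kernel_perceptron(X, y, epochs=20, gamma=1.0):
    # alpha[j] counts how often point j was misclassified; the decision
    # function is a kernel expansion over the training points.
    alpha = [0] * len(X)
    for _ in range(epochs):
        for i, xi in enumerate(X):
            s = sum(alpha[j] * y[j] * rbf(X[j], xi, gamma)
                    for j in range(len(X)))
            if y[i] * s <= 0:  # misclassified: add this point to the expansion
                alpha[i] += 1
    return alpha

def predict(alpha, X, y, x, gamma=1.0):
    s = sum(alpha[j] * y[j] * rbf(X[j], x, gamma) for j in range(len(X)))
    return 1 if s >= 0 else -1

alpha = train_kernel_perceptron(X, y)
preds = [predict(alpha, X, y, xi) for xi in X]  # classifies all four points
```

The point is that the learner never computes the non-linear feature map explicitly; it only evaluates kernel values between pairs of points, which is exactly the trick kernel SVMs use.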

# Archives for **February, 2017**

Here’s a list of different gradient descent optimizers: http://sebastianruder.com/optimizing-gradient-descent The most common one is SGD, which is also the most basic one in TF: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/training/gradient_descent_test.py Other versions are listed in the training folder: https://github.com/tensorflow/tensorflow/tree/master/tensorflow/python/training Although mentioned in the list, the learning rate of SGD should gradually decrease as the iteration count goes up, but there’s no […]
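The decaying learning rate mentioned above can be sketched in plain Python (a toy 1/t-style schedule, not TF’s implementation; the function name and decay constant are assumptions):

```python
# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3),
# with a learning rate that shrinks as the iteration count grows.
def sgd_with_decay(w0=0.0, base_lr=0.5, decay=0.01, steps=200):
    w = w0
    for t in range(steps):
        lr = base_lr / (1.0 + decay * t)  # 1/t-style decay schedule
        grad = 2.0 * (w - 3.0)
        w -= lr * grad
    return w

w = sgd_with_decay()  # converges near the minimizer w = 3
```

A fixed large step size can oscillate around the minimum forever; shrinking the rate over time is what lets plain SGD settle down, which is why the list treats the decay schedule as part of the optimizer.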

I think it’s time to move on to TensorFlow and embrace the DL world now. So first, why TensorFlow? It’s mainly due to the following comparison from Dr. Matt Rubashkin: A concise summary of deep learning frameworks. I chose TF because it supports multiple GPUs and it’s supposed to be easy to set up, and yes […]