Why did Google build the TPU instead of inventing some super-powerful GPU?



Deep learning researchers tend to think training is the core problem, because they usually lack the funds to buy the fastest machines. Google doesn't have that worry: it has tons of powerful machines, and finding the resources to train a good model isn't hard for Google.

Winning deep learning contests isn't Google's goal; it's just PR. Google wants to provide AI cloud services, so it keeps releasing well-trained models such as Inception-v3 and Word2vec. Most customers will simply call the APIs built on Google's pre-trained models, like the Cloud Natural Language API, Cloud Speech API, Cloud Translation API, Cloud Vision API, and Cloud Video Intelligence API. Some will want to use models provided by Google or other companies, or just do some fine-tuning. And only a few will want to train their models from scratch.

So Google cares about serving more than training, and it built the TPU to speed up serving and reduce serving latency.



How to install Theano on Mac OS X El Capitan with OpenCL support



I have two Macs, a Mac Pro and a MacBook Pro. Both have AMD graphics cards, so I can't use CUDA to speed up machine learning; I want to use OpenCL instead.

First install some requirements:
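The list of requirements is not preserved in this copy of the post. At a minimum you need the Xcode command line tools (for clang and Apple's OpenCL framework) and CMake, which is needed later to build libgpuarray; a sketch, assuming you use Homebrew:

    # Xcode command line tools: compiler plus the OpenCL framework.
    xcode-select --install

    # CMake is needed later to build libgpuarray.
    brew install cmake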

Then set up the install environment (using virtualenv keeps everything clean):
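The exact commands are missing here; a minimal sketch, assuming pip and virtualenv (the environment name theano-env is just an example):

    # Create and activate an isolated Python environment for Theano.
    pip install virtualenv
    virtualenv theano-env
    source theano-env/bin/activate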

Install Theano:
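The original command is not shown; the usual way, inside the activated virtualenv, is:

    # NumPy, SciPy and nose are Theano's usual prerequisites; then Theano itself.
    pip install numpy scipy nose
    pip install Theano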

Then you can use this demo code (test.py) to test your Theano install:
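The listing of test.py is missing from this copy. Judging from the output described below (1000 iterations, a "Used the cpu" / "Used the gpu" line), it is the standard GPU test script from the Theano documentation, reproduced here:

    from theano import function, config, shared
    import theano.tensor as T
    import numpy
    import time

    vlen = 10 * 30 * 768  # 10 x #cores x #threads per core
    iters = 1000

    # Build a simple graph: element-wise exp() over a big random vector.
    rng = numpy.random.RandomState(22)
    x = shared(numpy.asarray(rng.rand(vlen), config.floatX))
    f = function([], T.exp(x))
    print(f.maker.fgraph.toposort())

    # Time 1000 evaluations of the compiled function.
    t0 = time.time()
    for i in range(iters):
        r = f()
    t1 = time.time()
    print("Looping %d times took %f seconds" % (iters, t1 - t0))
    print("Result is %s" % (r,))

    # If the graph still contains plain Elemwise ops, Theano ran on the CPU;
    # otherwise the ops were moved to the GPU backend.
    if numpy.any([isinstance(node.op, T.Elemwise) for node in f.maker.fgraph.toposort()]):
        print('Used the cpu')
    else:
        print('Used the gpu')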

This test will show something like this:
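The original output listing did not survive; based on the description below it looked roughly like this (the graph printout and the result vector are elided):

    Looping 1000 times took 1.492254 seconds
    Result is [ ... ]
    Used the cpu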

It just does some math 1000 times; here it took 1.492254 seconds and used the CPU.

Then we must install libgpuarray to make Theano support OpenCL:
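The build commands are missing here; this is the standard procedure from the libgpuarray documentation (run it inside the virtualenv; make install may need sudo depending on your install prefix):

    # Build the C library first, then the pygpu Python bindings.
    git clone https://github.com/Theano/libgpuarray.git
    cd libgpuarray
    mkdir Build && cd Build
    cmake .. -DCMAKE_BUILD_TYPE=Release
    make
    make install   # may require sudo
    cd ..
    python setup.py build
    python setup.py install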

Then we can use THEANO_FLAGS to pick an OpenCL device.

Use OpenCL & CPU on my Mac Pro:
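The command itself is not preserved; it would have been something like the following. The device numbering is platform dependent, and opencl0:0 (platform 0, device 0) being the OpenCL CPU device on this Mac Pro is an assumption:

    # Run the test on OpenCL platform 0, device 0 (the CPU here).
    THEANO_FLAGS=device=opencl0:0 python test.py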

Note: the script printed "Used the gpu", but as you can see, it was actually running on the CPU.

Use OpenCL & GPU 1 on my Mac Pro:
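Again a sketch; opencl0:1 being the discrete GPU on this Mac Pro is an assumption:

    # Run the test on OpenCL platform 0, device 1 (the GPU here).
    THEANO_FLAGS=device=opencl0:1 python test.py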

Finally, we can use OpenCL with the GPU.

Performance:

My Mac Pro

[Screenshot 2016-04-01 15.09.25]

My MacBook Pro

[Screenshot 2016-04-01 15.09.32]

Note: now we can use Theano with OpenCL. But I was very sad when I found out that Theano's neural network operations still rely on CUDA (cuDNN), so to do neural networks with Theano you still need CUDA support. For now I don't know how to solve this problem.

Chinese version of this article.