Trying the Chainer CIFAR example

According to the README, train_cifar.py uses GPU 0 by default.

$ python train_cifar.py

As with the MNIST example, it starts by downloading the dataset.

Estimated time to finish: 3:57:33

was the estimate it printed.
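If you want a different device, the example should accept a `--gpu` option on the command line; a sketch, assuming the argparse flags that Chainer's bundled examples conventionally use (`-1` selects the CPU):

```shell
# Run on GPU 1 instead of the default GPU 0
python train_cifar.py --gpu 1

# Run on the CPU (Chainer examples conventionally use -1 for CPU)
python train_cifar.py --gpu -1
```

Note that CPU training would take far longer than the roughly four hours estimated above for the GPU.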

Results of the first run

$ python train_cifar.py 
GPU: 0
# Minibatch-size: 128
# epoch: 300

Using CIFAR10 dataset.
Downloading from https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz...
epoch       main/loss   validation/main/loss  main/accuracy  validation/main/accuracy  elapsed_time
1           2.40104     1.94277               0.161325       0.217168                  51.6946       
2           1.81075     1.66216               0.285826       0.347706                  100.377       
3           1.50989     1.42922               0.420052       0.475771                  149.203       
4           1.27429     1.15027               0.543938       0.594343                  198.176       
5           1.0685      1.00839               0.627757       0.655756                  247.159       
6           0.953728    0.983496              0.673437       0.667623                  296.061       
7           0.871836    0.878536              0.710038       0.707476                  345.054       
8           0.825212    0.826754              0.729547       0.733782                  393.962       
9           0.789095    0.813937              0.745165       0.739517                  442.956       
10          0.753746    0.79951               0.757433       0.742089                  491.96        
11          0.729694    0.760835              0.767007       0.762658                  540.833       

 ...

285         0.00550012  0.524492              0.998881       0.899031                  13941.3       
286         0.00420565  0.549105              0.999379       0.89646                   13990.1       
287         0.00410778  0.539779              0.999341       0.897547                  14039         
288         0.00399757  0.536811              0.999419       0.897646                  14087.7       
289         0.00549819  0.532966              0.999021       0.898042                  14136.6       
290         0.00449508  0.525496              0.999341       0.899229                  14185.5       
291         0.00408126  0.528732              0.999459       0.898438                  14234.3       
292         0.00433158  0.529872              0.999361       0.897152                  14283.2       
293         0.00442761  0.534484              0.999321       0.897745                  14332.1       
294         0.00480573  0.52566               0.999179       0.898734                  14380.9       
295         0.0047175   0.535832              0.999321       0.899525                  14429.8       
296         0.00457282  0.548527              0.999299       0.89468                   14478.7       
297         0.00466009  0.538513              0.999241       0.896657                  14527.7       
298         0.00482552  0.535228              0.999201       0.898438                  14576.8       
299         0.00517972  0.528672              0.998978       0.899328                  14625.6       
300         0.00443006  0.533218              0.999261       0.899723                  14674.5
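The `elapsed_time` column is cumulative seconds, so the final log line gives the total wall-clock time of the run. A quick sanity check of the per-epoch cost (values taken from the log above):

```python
# Average time per epoch from the final Chainer log line.
# Assumption: elapsed_time is cumulative seconds, as printed by LogReport.
total_seconds = 14674.5   # elapsed_time at epoch 300
epochs = 300

per_epoch = total_seconds / epochs
print(f"average: {per_epoch:.1f} s/epoch")        # average: 48.9 s/epoch
print(f"total:   {total_seconds / 3600:.2f} h")   # total:   4.08 h
```

For comparison, the initial estimate of 3:57:33 is about 14,253 seconds, so the run finished only slightly slower than predicted.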

f:id:pongsuke:20170317111704p:plain