2017/10/19

Network debugging methods

## Listing processes on Linux/macOS

List processes (plain `ps` only shows processes attached to the current terminal; use `ps aux` to list every process on the system):

```
ps
```

Output:

```
  PID TTY           TIME CMD
21968 ttys000    0:00.16 -bash
```

Find out which process is occupying port 3000:

```
lsof -i :3000
```

Output:

```
COMMAND   PID  USER   FD   TYPE          DEVICE SIZE/OFF NODE NAME
ruby    20308 etrex   25u  IPv4 0xd000000000000      0t0  TCP *:hbci (LISTEN)
```

(lsof prints `*:hbci` rather than `*:3000` because it maps port 3000 to its service name; pass `-P` to keep the numeric port.)

Kill the process:

```
# kill sends SIGTERM by default; kill -9 sends an uncatchable SIGKILL
kill 20308
```

## References

kill process: [https://blog.gtwang.org/linux/linux-kill-killall-xkill/](https://blog.gtwang.org/linux/linux-kill-killall-xkill/)

Ruby - RSpec

## RSpec basics

RSpec does not guarantee the order in which examples run, and data created by one example is not cleaned up automatically (a configuration sketch for making the ordering explicitly random appears at the end of this post).

A minimal example:

```
require 'rails_helper'

RSpec.describe "spec description" do
  describe "in a given state" do
    # set up the state variable
    let(:a) { 1 }

    it "should be 1" do
      puts "should be 1"
      expect(a).to eq(1)
    end
  end
end
```

`let` runs its block the first time the variable is accessed within each example, then memoizes the result:

```
require 'rails_helper'

RSpec.describe "spec description" do
  describe "in a given state" do
    let(:a) { puts "let a"; 1 }

    it "1" do
      puts "1"
      puts "a=#{a}"
    end

    it "2" do
      puts "2"
      puts "a=#{a}"
      puts "a=#{a}"
      puts "a=#{a}"
    end
  end
end
```

Output:

```
1
let a
a=1
.2
let a
a=1
a=1
a=1
.
```

`before` and `after` run before and after each example:

```
RSpec.describe "spec description" do
  describe "in a given state" do
    before { puts "before" }
    after { puts "after" }

    it "1" do
      puts "1"
    end

    it "2" do
      puts "2"
    end
  end
end
```

Output:

```
before
1
after
.before
2
after
.
```

Because database operations in one test can affect another, use [database_cleaner](https://github.com/DatabaseCleaner/database_cleaner) if you want every test to start from a clean database:

```
RSpec.configure do |config|
  config.before(:suite) do
    DatabaseCleaner.strategy = :transaction
    DatabaseCleaner.clean_with(:truncation)
  end

  config.around(:each) do |example|
    DatabaseCleaner.cleaning do
      example.run
    end
  end
end
```
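A follow-up to the note above that RSpec does not guarantee execution order: a minimal sketch, using standard RSpec configuration options (not part of the original post), that opts into explicit random ordering so order dependencies surface early:

```
RSpec.configure do |config|
  # run examples in random order; a failing order can be reproduced with `rspec --seed <n>`
  config.order = :random

  # seed Ruby's RNG with the same seed so randomness inside specs is reproducible too
  Kernel.srand config.seed
end
```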

Ruby - debugging methods

## Inspecting the inheritance chain

```
File.ancestors
# [File, IO, File::Constants, Enumerable, Object, Kernel, BasicObject]
```

## Finding methods from an object

```
# class methods of File
File.methods

# instance methods of File
File.instance_methods

# instance methods, looked up via an instance
File.new('/').methods

# include methods inherited along the ancestor chain (true is the default)
File.methods(true)

# only methods defined directly on File
File.methods(false)
```

## Finding a method's definition

```
File.method(:read)                # #<Method: File(IO).read>
IO.method(:read)                  # #<Method: IO.read>
IO.method(:read).source_location  # nil
IO.method(:read).owner            # => #<Class:IO> (read is defined on IO's singleton class)
```

`source_location` returns nil here because IO.read is implemented in C, so there is no Ruby source file to point to.

## References

Why File.method(:read).source_location is nil: [https://ja.stackoverflow.com/questions/5755/ruby-file-read-%E3%83%A1%E3%82%BD%E3%83%83%E3%83%89%E3%81%AE%E8%AA%AC%E6%98%8E%E3%82%92api%E3%83%89%E3%82%AD%E3%83%A5%E3%83%A1%E3%83%B3%E3%83%88%E3%81%A7%E8%AA%BF%E3%81%B9%E3%81%9F%E3%81%84](https://ja.stackoverflow.com/questions/5755/ruby-file-read-%E3%83%A1%E3%82%BD%E3%83%83%E3%83%89%E3%81%AE%E8%AA%AC%E6%98%8E%E3%82%92api%E3%83%89%E3%82%AD%E3%83%A5%E3%83%A1%E3%83%B3%E3%83%88%E3%81%A7%E8%AA%BF%E3%81%B9%E3%81%9F%E3%81%84)

Python - computer font digit recognition

Continuing from the previous post: [Python - draw text on image and image to numpy array](http://etrex.blogspot.tw/2017/10/python-draw-text-on-image-and-image-to.html)

## Goal

Build a simple neural network trained only on the ten computer-font digit images 0–9, with the training data doubling as the test data; after about 40 epochs, accuracy reaches 100%.

## This post covers

* basic keras usage
* exporting a keras Model as an image

## Code

```
import numpy as np
from PIL import Image
from PIL import ImageFont
from PIL import ImageDraw

image_size = (8,13)
font_size = 10

x_train = np.array([])
y_train = np.array([])

for i in range(10):
    # create a blank image
    image = Image.new('L', image_size, 0)
    # get a drawing context
    draw = ImageDraw.Draw(image)
    # font setup
    # font = ImageFont.truetype("C:/Windows/Fonts/cour.ttf", font_size)
    font = ImageFont.truetype("C:/Windows/Fonts/msjh.ttc", font_size)
    # disable anti-aliasing
    draw.fontmode = '1'
    # measure the text size
    text_size = draw.textsize(str(i), font)
    # print('text_size:', text_size)
    # center the text
    text_position = ((image_size[0]-text_size[0])//2, (image_size[1]-text_size[1])//2)
    # print('text_position:', text_position)
    # draw the text
    draw.text(text_position, str(i), 255, font)
    # save to file
    image.save(str(i)+'.bmp')
    # convert to a numpy array
    na = np.array(image.getdata()).reshape(image.size[1], image.size[0])
    # append to the training data
    x_train = np.append(x_train, na)
    y_train = np.append(y_train, i)

import keras
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import RMSprop

# number of samples per gradient update
batch_size = 1
# number of digit classes
num_classes = 10
# number of training epochs
epochs = 100

# reshape the data into the format the model expects: 10 samples x (8*13 = 104) pixels
x_train = x_train.reshape(10, 104)
x_train = x_train.astype('float32')
x_train /= 255
print(x_train.shape[0], 'train samples')

# one-hot encode the labels
y_train = keras.utils.to_categorical(y_train, num_classes)

# build the model
model = Sequential()
# a single Dense layer with softmax activation
model.add(Dense(num_classes, input_shape=(104,), activation='softmax'))

# print a summary of the layers and parameter counts
model.summary()

# choose the loss function and optimizer
model.compile(loss='categorical_crossentropy',
              optimizer=RMSprop(),
              metrics=['accuracy'])

# train
history = model.fit(x_train, y_train,
                    batch_size=batch_size,
                    epochs=epochs,
                    verbose=1,
                    validation_data=(x_train, y_train))

# compute the final score
score = model.evaluate(x_train, y_train, verbose=0)
print('Test loss:', score[0])
print('Test accuracy:', score[1])

# save the model architecture as an image
from keras.utils import plot_model
plot_model(model, to_file='model.png')
```
## Output

```
Using TensorFlow backend.
10 train samples
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense_1 (Dense)              (None, 10)                1050
=================================================================
Total params: 1,050
Trainable params: 1,050
Non-trainable params: 0
_________________________________________________________________
```
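An aside not in the original post: the 1,050 parameters reported by `model.summary()` are exactly the Dense layer's weight matrix plus its biases, i.e. 104 inputs × 10 outputs + 10 biases = 1,050.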
```
Train on 10 samples, validate on 10 samples
Epoch 1/100
10/10 [==============================] - ETA: 0s - loss: 2.5473 - acc: 0.0000e+00 - val_loss: 2.4555 - val_acc: 0.1000
Epoch 2/100
10/10 [==============================] - ETA: 0s - loss: 2.4563 - acc: 0.1000 - val_loss: 2.3924 - val_acc: 0.1000
Epoch 3/100
10/10 [==============================] - ETA: 0s - loss: 2.3926 - acc: 0.1000 - val_loss: 2.3334 - val_acc: 0.2000
Epoch 4/100
10/10 [==============================] - ETA: 0s - loss: 2.3336 - acc: 0.2000 - val_loss: 2.2760 - val_acc: 0.2000
Epoch 5/100
10/10 [==============================] - ETA: 0s - loss: 2.2766 - acc: 0.2000 - val_loss: 2.2196 - val_acc: 0.2000
...
Epoch 32/100
10/10 [==============================] - ETA: 0s - loss: 1.0269 - acc: 0.9000 - val_loss: 0.9905 - val_acc: 1.0000
Epoch 33/100
10/10 [==============================] - ETA: 0s - loss: 0.9932 - acc: 1.0000 - val_loss: 0.9571 - val_acc: 1.0000
...
Epoch 100/100
10/10 [==============================] - ETA: 0s - loss: 0.1141 - acc: 1.0000 - val_loss: 0.1095 - val_acc: 1.0000
Test loss: 0.109465420246
Test accuracy: 1.0
```
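Not in the original post, but a quick sanity check is to run the trained model back over the ten images; a minimal sketch, assuming it is appended to the script above so that `model` and `x_train` are still in scope:

```
# predict class probabilities for each of the ten images,
# then take the most likely class per row
predictions = model.predict(x_train)
print(predictions.argmax(axis=1))  # expected: [0 1 2 3 4 5 6 7 8 9]
```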
## The output image

Running plot_model requires pydot and graphviz:

```
pip3 install pydot
```

## Environment setup for plot_model

On Windows, graphviz has to be installed manually; the installer can be downloaded from the official site: [http://www.graphviz.org/Download_windows.php](http://www.graphviz.org/Download_windows.php)

After installing, add C:\Program Files (x86)\Graphviz2.30\bin to the PATH environment variable.

Verify the installation from cmd:

```
dot -version
```

If it prints something like the following, the installation succeeded:

```
dot - graphviz version 2.30.1 (20130214.1330)
libdir = "C:\Program Files (x86)\Graphviz2.30\bin"
Activated plugin library: gvplugin_pango.dll
Using textlayout: textlayout:cairo
Activated plugin library: gvplugin_dot_layout.dll
Using layout: dot:dot_layout
Activated plugin library: gvplugin_core.dll
Using render: dot:core
Using device: dot:dot:core
The plugin configuration file:
        C:\Program Files (x86)\Graphviz2.30\bin\config6
was successfully loaded.
    render     : cairo dot fig gd gdiplus map pic pov ps svg tk vml vrml xdot
    layout     : circo dot fdp neato nop nop1 nop2 osage patchwork sfdp twopi
    textlayout : textlayout
    device     : bmp canon cmap cmapx cmapx_np dot emf emfplus eps fig gd gd2 gif gv imap imap_np ismap jpe jpeg jpg metafile pdf pic plain plain-ext png pov ps ps2 svg svgz tif tiff tk vml vmlz vrml wbmp xdot
    loadimage  : (lib) bmp eps gd gd2 gif jpe jpeg jpg png ps svg
```

## References

plot_model environment setup on Windows: [https://zhuanlan.zhihu.com/p/28158957](https://zhuanlan.zhihu.com/p/28158957)

Using graphviz: [https://www.openfoundry.org/tw/foss-programs/8820-graphviz-](https://www.openfoundry.org/tw/foss-programs/8820-graphviz-)

2017/10/17

Python - draw text on image and image to numpy array

## Goal

Render the digits 0–9 in Microsoft JhengHei and convert each image to a numpy array.

## This post covers

* creating and saving an image
* drawing text on an image with a given font and size
* toggling anti-aliasing
* converting an image to a numpy array

## Code

```
import numpy
from PIL import Image
from PIL import ImageFont
from PIL import ImageDraw

image_size = (8,13)
font_size = 10

for i in range(10):
    # create a blank image
    image = Image.new('L', image_size, 0)
    # get a drawing context
    draw = ImageDraw.Draw(image)
    # Microsoft JhengHei
    font = ImageFont.truetype("C:/Windows/Fonts/msjh.ttc", font_size)
    # disable anti-aliasing
    draw.fontmode = '1'
    # measure the text size
    text_size = draw.textsize(str(i), font)
    # print('text_size:', text_size)
    # center the text
    text_position = ((image_size[0]-text_size[0])//2, (image_size[1]-text_size[1])//2)
    # print('text_position:', text_position)
    # draw the text
    draw.text(text_position, str(i), 255, font)
    # save to file
    image.save(str(i)+'.bmp')
    # convert to a numpy array
    na = numpy.array(image.getdata()).reshape(image.size[1], image.size[0])
    # print it
    print(na)
```
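A side note not in the original post: Pillow images expose the numpy array interface, so numpy can convert an image directly; a minimal sketch equivalent to the `getdata()`/`reshape` line above, assuming the bitmaps saved by the loop exist:

```
import numpy
from PIL import Image

image = Image.open('0.bmp')   # load one of the bitmaps saved above
na = numpy.array(image)       # shape (13, 8): rows = height, columns = width
print(na.shape)
```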
## Output

```
[[  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0 255 255 255 255   0   0]
 [  0   0 255   0   0 255 255   0]
 [  0 255   0   0   0   0 255   0]
 [  0 255   0   0   0   0 255   0]
 [  0 255   0   0   0   0 255   0]
 [  0 255   0   0   0   0 255   0]
 [  0 255 255   0   0 255   0   0]
 [  0   0 255 255 255 255   0   0]
 [  0   0   0   0   0   0   0   0]]

[[  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0 255 255   0   0   0]
 [  0   0 255 255 255   0   0   0]
 [  0   0   0   0 255   0   0   0]
 [  0   0   0   0 255   0   0   0]
 [  0   0   0   0 255   0   0   0]
 [  0   0   0   0 255   0   0   0]
 [  0   0   0   0 255   0   0   0]
 [  0   0 255 255 255 255 255   0]
 [  0   0   0   0   0   0   0   0]]

[[  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0 255 255 255   0   0]
 [  0   0 255   0   0   0 255   0]
 [  0   0   0   0   0   0 255   0]
 [  0   0   0   0   0   0 255   0]
 [  0   0   0   0   0 255   0   0]
 [  0   0   0   0 255   0   0   0]
 [  0   0   0 255   0   0   0   0]
 [  0   0 255 255 255 255 255   0]
 [  0   0   0   0   0   0   0   0]]

[[  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0 255 255   0   0   0]
 [  0   0 255   0   0 255   0   0]
 [  0   0   0   0   0 255   0   0]
 [  0   0   0 255 255   0   0   0]
 [  0   0   0   0   0 255   0   0]
 [  0   0   0   0   0 255   0   0]
 [  0   0 255   0   0 255   0   0]
 [  0   0 255 255 255   0   0   0]
 [  0   0   0   0   0   0   0   0]]

[[  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0 255 255   0   0]
 [  0   0   0   0 255 255   0   0]
 [  0   0   0 255   0 255   0   0]
 [  0   0 255   0   0 255   0   0]
 [  0 255   0   0   0 255   0   0]
 [  0 255 255 255 255 255 255   0]
 [  0   0   0   0   0 255   0   0]
 [  0   0   0   0   0 255   0   0]
 [  0   0   0   0   0   0   0   0]]

[[  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0 255 255 255 255   0   0]
 [  0   0 255   0   0   0   0   0]
 [  0   0 255   0   0   0   0   0]
 [  0   0 255 255 255   0   0   0]
 [  0   0   0   0   0 255   0   0]
 [  0   0   0   0   0 255   0   0]
 [  0   0 255   0   0 255   0   0]
 [  0   0 255 255 255   0   0   0]
 [  0   0   0   0   0   0   0   0]]

[[  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0 255 255 255   0]
 [  0   0   0 255   0   0   0   0]
 [  0   0 255   0   0   0   0   0]
 [  0   0 255   0 255 255   0   0]
 [  0   0 255 255   0   0 255   0]
 [  0   0 255   0   0   0 255   0]
 [  0   0 255   0   0   0 255   0]
 [  0   0   0 255 255 255   0   0]
 [  0   0   0   0   0   0   0   0]]

[[  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0 255 255 255 255 255 255   0]
 [  0   0   0   0   0 255   0   0]
 [  0   0   0   0   0 255   0   0]
 [  0   0   0   0 255   0   0   0]
 [  0   0   0   0 255   0   0   0]
 [  0   0   0 255   0   0   0   0]
 [  0   0   0 255   0   0   0   0]
 [  0   0 255   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]]

[[  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0 255 255   0   0   0]
 [  0   0 255   0   0 255   0   0]
 [  0   0 255   0   0 255   0   0]
 [  0   0   0 255 255   0   0   0]
 [  0   0   0 255   0 255   0   0]
 [  0   0 255   0   0   0 255   0]
 [  0   0 255   0   0   0 255   0]
 [  0   0   0 255 255 255   0   0]
 [  0   0   0   0   0   0   0   0]]

[[  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0   0   0   0   0   0]
 [  0   0   0 255 255 255   0   0]
 [  0   0 255   0   0   0 255   0]
 [  0   0 255   0   0   0 255   0]
 [  0   0 255   0   0   0 255   0]
 [  0   0   0 255 255 255 255   0]
 [  0   0   0   0   0   0 255   0]
 [  0   0   0   0   0 255   0   0]
 [  0   0 255 255 255   0   0   0]
 [  0   0   0   0   0   0   0   0]]
```

## References

PIL image reference: [http://pillow.readthedocs.io/en/stable/reference/index.html](http://pillow.readthedocs.io/en/stable/reference/index.html)