When training a neural network, we often want to plot how the loss function changes over the course of training. The training log records the loss value as training proceeds, so we first save the log to a file named log.txt and then use that file to draw the plot.
1. First, generate a log file.
import mxnet as mx
import numpy as np
import os
import logging

logging.getLogger().setLevel(logging.DEBUG)
# Save the training log to log.txt in the current working directory
logging.basicConfig(filename=os.path.join(os.getcwd(), 'log.txt'), level=logging.DEBUG)

# Training data: labels follow y = x0 + 2 * x1
train_data = np.random.uniform(0, 1, [100, 2])
train_label = np.array([train_data[i][0] + 2 * train_data[i][1] for i in range(100)])
batch_size = 1
num_epoch = 20

# Evaluation data
eval_data = np.array([[7, 2], [6, 10], [12, 2]])
eval_label = np.array([11, 26, 16])

train_iter = mx.io.NDArrayIter(train_data, train_label, batch_size, shuffle=True, label_name='lin_reg_label')
eval_iter = mx.io.NDArrayIter(eval_data, eval_label, batch_size, shuffle=False)

# Network: a single fully connected layer with a linear regression output
X = mx.sym.Variable('data')
Y = mx.sym.Variable('lin_reg_label')
fully_connected_layer = mx.sym.FullyConnected(data=X, name='fc1', num_hidden=1)
lro = mx.sym.LinearRegressionOutput(data=fully_connected_layer, label=Y, name="lro")

model = mx.mod.Module(
    symbol=lro,                        # network structure
    data_names=['data'],
    label_names=['lin_reg_label']
)

model.fit(train_iter, eval_iter,
          optimizer_params={'learning_rate': 0.005, 'momentum': 0.9},
          num_epoch=num_epoch,
          eval_metric='mse')

model.predict(eval_iter).asnumpy()

metric = mx.metric.MSE()
model.score(eval_iter, metric)
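After training, log.txt should contain the per-epoch metric lines that Module.fit writes through the logging module. The following is a minimal sketch of how the log could then be used for plotting, assuming each epoch produced lines containing "Epoch[N] Train-mse=..." and "Epoch[N] Validation-mse=..."; the regular expressions, the matplotlib usage, and the output file name loss_curve.png are illustrative choices, not part of the original code.

# A minimal sketch: parse log.txt and plot the MSE curves.
# Assumes log lines of the form "Epoch[N] Train-mse=..." and
# "Epoch[N] Validation-mse=..." (MXNet's standard per-epoch messages).
import re
import matplotlib.pyplot as plt

train_pat = re.compile(r'Epoch\[(\d+)\] Train-mse=([\d.eE+-]+)')
val_pat = re.compile(r'Epoch\[(\d+)\] Validation-mse=([\d.eE+-]+)')

train_mse, val_mse = {}, {}
with open('log.txt') as f:
    for line in f:
        m = train_pat.search(line)
        if m:
            train_mse[int(m.group(1))] = float(m.group(2))
        m = val_pat.search(line)
        if m:
            val_mse[int(m.group(1))] = float(m.group(2))

# Plot MSE against epoch number
epochs_train = sorted(train_mse)
epochs_val = sorted(val_mse)
plt.plot(epochs_train, [train_mse[e] for e in epochs_train], label='train mse')
plt.plot(epochs_val, [val_mse[e] for e in epochs_val], label='validation mse')
plt.xlabel('epoch')
plt.ylabel('mse')
plt.legend()
plt.savefig('loss_curve.png')  # hypothetical output file name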