[Question] Keras prediction problem

Board: DataScience, Author: (加菲貓星人), Time: 2018/06/01 00:17 (edited), Push score: 5 (5 pushes, 0 boos, 10 arrows)
15 comments, 5 participants, discussion thread 1/1
Hi everyone. I have learned some DM (data mining) methods and am now trying prediction with Keras. The problem I am running into is this: I want to use 25 columns to predict a Y value. With the traditional DM approach, you feed new data into the trained model and it predicts the new Y. In Keras, however, the predicted y comes out as an array. So I would like to ask: how do I convert new data into a form the trained model can take? Thanks!

Here is my code:

import numpy
import pandas as pd
from sklearn import preprocessing

numpy.random.seed(10)

all_df = pd.read_csv("/Users/mac/Desktop/123.csv")
cols = ['x1','x2','x3','x4','x5','x6','x7','x8','x9','x10',
        'x11','x12','x13','x14','x15','x16','x17','x18','x19',
        'x20','x21','x22','x23','x24','x25']   # column names
all_df = all_df[cols]

msk = numpy.random.rand(len(all_df)) < 0.8
train_df = all_df[msk]
test_df = all_df[~msk]

train_Features = all_df[['x1','x2','x3','x4','x5','x6','x7','x8','x9',
                         'x10','x11','x12','x13','x14','x15','x16','x17','x18','x19',
                         'x20','x21','x22','x23','x24','x25']]
train_Label = all_df['x25']
test_Features = all_df[['x1','x2','x3','x4','x5',
                        'x6','x7','x8','x9','x10','x11','x12','x13','x14',
                        'x15','x16','x17','x18','x19','x20','x21','x22','x23','x24','x25']]
test_Label = all_df['x25']

print(len(train_df))
print(len(test_df))
print(len(all_df))

from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(units=40, input_dim=25, kernel_initializer='uniform', activation='relu'))
model.add(Dense(units=30, kernel_initializer='uniform', activation='relu'))
model.add(Dense(units=1, kernel_initializer='uniform', activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])

train_history = model.fit(x=train_Features, y=train_Label,
                          validation_split=0.1, epochs=30, batch_size=30, verbose=2)
train_history

scores = model.evaluate(x=test_Features, y=test_Label)
scores[1]

Here are my model results:

runfile('/Users/mac/.spyder-py3/temp.py', wdir='/Users/mac/.spyder-py3')
74
16
90
Train on 81 samples, validate on 9 samples
Epoch 1/30 - 1s - loss: 0.6929 - acc: 0.4198 - val_loss: 0.6937 - val_acc: 0.1111
Epoch 2/30 - 0s - loss: 0.6902 - acc: 0.1852 - val_loss: 0.6944 - val_acc: 0.1111
Epoch 3/30 - 0s - loss: 0.6877 - acc: 0.1605 - val_loss: 0.6951 - val_acc: 0.1111
Epoch 4/30 - 0s - loss: 0.6851 - acc: 0.1605 - val_loss: 0.6957 - val_acc: 0.1111
Epoch 5/30 - 0s - loss: 0.6813 - acc: 0.1605 - val_loss: 0.6963 - val_acc: 0.1111
Epoch 6/30 - 0s - loss: 0.6767 - acc: 0.1852 - val_loss: 0.6970 - val_acc: 0.1111
Epoch 7/30 - 0s - loss: 0.6708 - acc: 0.2099 - val_loss: 0.6975 - val_acc: 0.1111
Epoch 8/30 - 0s - loss: 0.6628 - acc: 0.2222 - val_loss: 0.6979 - val_acc: 0.1111
Epoch 9/30 - 0s - loss: 0.6534 - acc: 0.3210 - val_loss: 0.6984 - val_acc: 0.1111
Epoch 10/30 - 0s - loss: 0.6397 - acc: 0.3580 - val_loss: 0.6986 - val_acc: 0.2222
Epoch 11/30 - 0s - loss: 0.6244 - acc: 0.4321 - val_loss: 0.6990 - val_acc: 0.2222
Epoch 12/30 - 0s - loss: 0.6039 - acc: 0.4815 - val_loss: 0.6990 - val_acc: 0.2222
Epoch 13/30 - 0s - loss: 0.5758 - acc: 0.5309 - val_loss: 0.6988 - val_acc: 0.2222
Epoch 14/30 - 0s - loss: 0.5467 - acc: 0.5432 - val_loss: 0.6990 - val_acc: 0.2222
Epoch 15/30 - 0s - loss: 0.5088 - acc: 0.5432 - val_loss: 0.6991 - val_acc: 0.2222
Epoch 16/30 - 0s - loss: 0.4600 - acc: 0.5432 - val_loss: 0.6986 - val_acc: 0.3333
Epoch 17/30 - 0s - loss: 0.4149 - acc: 0.5556 - val_loss: 0.6988 - val_acc: 0.3333
Epoch 18/30 - 0s - loss: 0.3513 - acc: 0.5679 - val_loss: 0.6993 - val_acc: 0.4444
Epoch 19/30 - 0s - loss: 0.2774 - acc: 0.5556 - val_loss: 0.6992 - val_acc: 0.4444
Epoch 20/30 - 0s - loss: 0.2010 - acc: 0.5556 - val_loss: 0.7004 - val_acc: 0.4444
Epoch 21/30 - 0s - loss: 0.1163 - acc: 0.5556 - val_loss: 0.7034 - val_acc: 0.4444
Epoch 22/30 - 0s - loss: 0.0139 - acc: 0.5556 - val_loss: 0.7056 - val_acc: 0.4444
Epoch 23/30 - 0s - loss: -8.1930e-02 - acc: 0.5679 - val_loss: 0.7121 - val_acc: 0.4444
Epoch 24/30 - 0s - loss: -1.9559e-01 - acc: 0.5679 - val_loss: 0.7214 - val_acc: 0.4444
Epoch 25/30 - 0s - loss: -3.2348e-01 - acc: 0.5679 - val_loss: 0.7327 - val_acc: 0.4444
Epoch 26/30 - 0s - loss: -4.4836e-01 - acc: 0.5802 - val_loss: 0.7467 - val_acc: 0.4444
Epoch 27/30 - 0s - loss: -5.7915e-01 - acc: 0.5802 - val_loss: 0.7694 - val_acc: 0.4444
Epoch 28/30 - 0s - loss: -7.3865e-01 - acc: 0.5802 - val_loss: 0.7944 - val_acc: 0.4444
Epoch 29/30 - 0s - loss: -8.9148e-01 - acc: 0.5802 - val_loss: 0.8236 - val_acc: 0.4444
Epoch 30/30 - 0s - loss: -1.0620e+00 - acc: 0.5802 - val_loss: 0.8666 - val_acc: 0.4444
90/90 [==============================] - 0s 49us/step

--
※ Posted via: PTT (ptt.cc), from: 119.14.41.117
※ Article URL: https://www.ptt.cc/bbs/DataScience/M.1527783435.A.9A9.html
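
(For context: the array the post mentions is simply what Keras returns from prediction. Below is a minimal sketch of the prediction step, assuming the model and cols from the code above have already been set up and that a hypothetical CSV of new rows contains the same 25 columns; the file name and variable names are illustrative only, not from the original post.)

# Sketch only: assumes the training code above has run, so model is fitted and
# cols holds the 25 column names; new_data.csv is a hypothetical file of new rows.
new_df = pd.read_csv("/Users/mac/Desktop/new_data.csv")
new_X = new_df[cols].values        # keep the training column order -> NumPy array of shape (n_rows, 25)

pred = model.predict(new_X)        # NumPy array of shape (n_rows, 1), one sigmoid output per row
print(pred[:5])                    # first few predicted values

Because the last layer is a single sigmoid unit, each entry of pred is a value in [0, 1]; thresholding it (for example pred > 0.5) gives 0/1 labels if that is what the Y column represents.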

1F (06/01 09:18): Is your model already trained, and you want to use new data to make predictions?

2F (06/01 11:02): What do your LSTM prediction results look like? Can you show the code?

3F (06/01 13:56): Are you trying to do a many-to-one or a many-to-many prediction?

4F (06/01 13:57)
※ Edited: taylor0607 (27.246.68.149), 06/01/2018 15:04:55

5F (06/01 15:05): OK, I've added it.

6F (06/01 15:05): Yes, I want to predict a new column, just like in DM.
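
(A hedged sketch of that "new column" step, reusing the illustrative new_df and pred names from the earlier sketch: the prediction array can simply be flattened and assigned back onto the DataFrame.)

# Sketch: turn the (n_rows, 1) prediction array into an ordinary DataFrame column.
new_df['y_pred'] = pred.flatten()                          # probability-like scores
new_df['y_class'] = (new_df['y_pred'] > 0.5).astype(int)   # optional 0/1 labels via a 0.5 threshold
new_df.to_csv("/Users/mac/Desktop/predictions.csv", index=False)   # hypothetical output path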

7F (06/01 15:06): To t: it's many-to-one.

8F (06/01 16:42): I don't see any LSTM anywhere. Isn't what you imported Dense?
※ Edited: taylor0607 (27.246.68.149), 06/01/2018 17:04:09

9F (06/01 17:04): Ah sorry, I meant Keras.

10F (06/01 17:18): Feed it in the same way you tested it during training. Doesn't scores contain anything?

11F (06/01 17:54): Are you asking how to predict on the test data?

12F (06/01 17:58): In Keras you can just use model.predict. The official documentation describes the parameters.
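
(As this comment says, prediction is a single call; a minimal example under the assumption that model and test_Features come from the code in the original post, with batch_size and verbose being optional arguments of predict() in Keras 2-era versions.)

# Assumes the original post's code has run; test_Features holds the 25 feature columns.
test_pred = model.predict(test_Features.values, batch_size=32, verbose=0)
print(test_pred.shape)   # (number_of_rows, 1): one sigmoid output per row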

13F (06/01 17:58)

14F (06/01 17:58): Or see this article: https://tinyurl.com/yc24oevc

15F (06/01 19:15): OK, thanks~
Article ID (AID): #1R420Bcf (DataScience)