Time Series Forecasting with a Bidirectional LSTM (Bi-LSTM)

This article walks through the complete process of time series forecasting with a bidirectional LSTM (Bi-LSTM), with detailed comments throughout. The workflow covers data loading, data cleaning, reshaping, building the Bi-LSTM model, training (including a dynamic learning-rate schedule and an early-stopping setup), prediction, visualization of the results, and error evaluation.
The dataset used in this article is available in my uploaded resources as mock_kaggle.csv.

The code is as follows:

import pandas as pd
import numpy as np
import math
import keras
from matplotlib import pyplot as plt
from matplotlib.pylab import mpl
import tensorflow as tf
from sklearn.preprocessing import MinMaxScaler
from keras import backend as K
from keras.layers import LeakyReLU
from sklearn.metrics import mean_squared_error  # mean squared error (MSE)
from keras.callbacks import LearningRateScheduler
from keras.callbacks import EarlyStopping
from tensorflow.keras import Input, Model,Sequential
from keras.layers import Bidirectional  #, Concatenate
mpl.rcParams['font.sans-serif'] = ['SimHei']   # render Chinese characters in plots
mpl.rcParams['axes.unicode_minus'] = False     # render minus signs correctly
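
One caveat on the imports: the script mixes the standalone keras package with tensorflow.keras, which can raise version-compatibility errors depending on what is installed. A minimal consolidated import block, assuming TensorFlow 2.x (where Keras ships as tensorflow.keras), might look like this:

import numpy as np
import pandas as pd
from matplotlib import pyplot as plt
from tensorflow import keras
from tensorflow.keras import backend as K
from tensorflow.keras.layers import LSTM, Dense, Bidirectional, LeakyReLU
from tensorflow.keras.callbacks import LearningRateScheduler, EarlyStopping
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import mean_squared_error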

Load the data

data = pd.read_csv('mock_kaggle.csv', encoding='gbk', parse_dates=['datetime'])
Date = pd.to_datetime(data.datetime)
data['date'] = Date.map(lambda x: x.strftime('%Y-%m-%d'))
datanew = data.set_index(Date)
series = pd.Series(datanew['股票'].values, index=datanew['date'])  # '股票' (stock) is the target column
series
date
2014-01-01    4972
2014-01-02    4902
2014-01-03    4843
2014-01-04    4750
2014-01-05    4654
              ... 
2016-07-27    3179
2016-07-28    3071
2016-07-29    4095
2016-07-30    3825
2016-07-31    3642
Length: 937, dtype: int64

Create lagged features

dataframe1 = pd.DataFrame()
num_hour = 16  # number of lagged observations used as features
for i in range(num_hour, 0, -1):
    dataframe1['t-'+str(i)] = series.shift(i)
dataframe1['t'] = series.values            # the current value is the prediction target
dataframe3 = dataframe1.dropna()           # drop the leading rows with missing lags
dataframe3.index = range(len(dataframe3))
dataframe3
t-16 t-15 t-14 t-13 t-12 t-11 t-10 t-9 t-8 t-7 t-6 t-5 t-4 t-3 t-2 t-1 t
0 4972.0 4902.0 4843.0 4750.0 4654.0 4509.0 4329.0 4104.0 4459.0 5043.0 5239.0 5118.0 4984.0 4904.0 4822.0 4728.0 4464
1 4902.0 4843.0 4750.0 4654.0 4509.0 4329.0 4104.0 4459.0 5043.0 5239.0 5118.0 4984.0 4904.0 4822.0 4728.0 4464.0 4265
2 4843.0 4750.0 4654.0 4509.0 4329.0 4104.0 4459.0 5043.0 5239.0 5118.0 4984.0 4904.0 4822.0 4728.0 4464.0 4265.0 4161
3 4750.0 4654.0 4509.0 4329.0 4104.0 4459.0 5043.0 5239.0 5118.0 4984.0 4904.0 4822.0 4728.0 4464.0 4265.0 4161.0 4091
4 4654.0 4509.0 4329.0 4104.0 4459.0 5043.0 5239.0 5118.0 4984.0 4904.0 4822.0 4728.0 4464.0 4265.0 4161.0 4091.0 3964
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
916 1939.0 1967.0 1670.0 1532.0 1343.0 1022.0 813.0 1420.0 1359.0 1075.0 1015.0 917.0 1550.0 1420.0 1358.0 2893.0 3179
917 1967.0 1670.0 1532.0 1343.0 1022.0 813.0 1420.0 1359.0 1075.0 1015.0 917.0 1550.0 1420.0 1358.0 2893.0 3179.0 3071
918 1670.0 1532.0 1343.0 1022.0 813.0 1420.0 1359.0 1075.0 1015.0 917.0 1550.0 1420.0 1358.0 2893.0 3179.0 3071.0 4095
919 1532.0 1343.0 1022.0 813.0 1420.0 1359.0 1075.0 1015.0 917.0 1550.0 1420.0 1358.0 2893.0 3179.0 3071.0 4095.0 3825
920 1343.0 1022.0 813.0 1420.0 1359.0 1075.0 1015.0 917.0 1550.0 1420.0 1358.0 2893.0 3179.0 3071.0 4095.0 3825.0 3642

921 rows × 17 columns
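
The loop above frames the univariate series as a supervised-learning table: each row holds the 16 previous values (t-16 through t-1) as features and the current value t as the target. For reuse, the same construction can be wrapped in a small helper function; this is a hypothetical refactoring, not part of the original script:

def make_lagged_frame(series, n_lags):
    """Frame a univariate series as rows of (t-n_lags, ..., t-1, t)."""
    df = pd.DataFrame()
    for i in range(n_lags, 0, -1):
        df['t-' + str(i)] = series.shift(i)
    df['t'] = series.values
    return df.dropna().reset_index(drop=True)

# equivalent to the code above:
# dataframe3 = make_lagged_frame(series, 16)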

Split into train/test sets and normalize

np.random.shuffle(dataframe3.values)  # shuffles the rows in place; np.random.shuffle returns None,
                                      # so wrapping the call in pd.DataFrame(...) had no effect
pot = len(dataframe3) - 12            # hold out 12 rows as the test set (random rows after shuffling, not the latest dates)
train = dataframe3[:pot]
test = dataframe3[pot:]
scaler = MinMaxScaler(feature_range=(0, 1))
#scaler = preprocessing.StandardScaler()
train_norm = pd.DataFrame(scaler.fit_transform(train))  # fit the scaler on the training data only
test_norm = pd.DataFrame(scaler.transform(test))
test_norm.shape,train_norm.shape
((12, 17), (909, 17))
X_train=train_norm.iloc[:,:-1]
X_test=test_norm.iloc[:,:-1]
Y_train=train_norm.iloc[:,-1:]
Y_test=test_norm.iloc[:,-1:]
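
Because one scaler is fitted across all 17 columns, the model's output will later have to be concatenated back onto the input features before inverse_transform can be applied (see the prediction section below). A simpler alternative, sketched here as an assumption rather than the author's method, is to fit an independent scaler for the target column:

# hypothetical alternative: separate scalers for features and target
x_scaler = MinMaxScaler(feature_range=(0, 1))
y_scaler = MinMaxScaler(feature_range=(0, 1))
X_train_alt = x_scaler.fit_transform(train.iloc[:, :-1])
X_test_alt = x_scaler.transform(test.iloc[:, :-1])
Y_train_alt = y_scaler.fit_transform(train.iloc[:, -1:])
Y_test_alt = y_scaler.transform(test.iloc[:, -1:])
# predictions could then be rescaled directly:
# real_predict = y_scaler.inverse_transform(predict)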

Reshape to 3-D: [samples, timesteps, features]

source_x_train = X_train  # keep the 2-D versions for the inverse transform later
source_x_test = X_test
X_train = X_train.values.reshape([X_train.shape[0], 2, 8])  # (909, 16) --> (909, 2, 8)
X_test = X_test.values.reshape([X_test.shape[0], 2, 8])     # (12, 16)  --> (12, 2, 8)
Y_train=Y_train.values
Y_test=Y_test.values
X_train.shape,Y_train.shape
((909, 2, 8), (909, 1))
X_test.shape,Y_test.shape
((12, 2, 8), (12, 1))
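
Folding the 16 lags into 2 timesteps of 8 features each is a modeling choice rather than a requirement. A common alternative framing treats each lag as its own timestep with a single feature, which exposes the full temporal ordering to the LSTM; a sketch (the downstream time_steps and input_size would change accordingly):

# one lag per timestep, one feature per step
X_train_seq = source_x_train.values.reshape([source_x_train.shape[0], 16, 1])
X_test_seq = source_x_test.values.reshape([source_x_test.shape[0], 16, 1])
X_train_seq.shape, X_test_seq.shape  # ((909, 16, 1), (12, 16, 1))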

Learning-rate schedule and early stopping

def scheduler(epoch):
    # every 50 epochs, reduce the learning rate to 1/10 of its current value
    if epoch % 50 == 0 and epoch != 0:
        lr = K.get_value(bilstm.optimizer.lr)
        if lr > 1e-5:
            K.set_value(bilstm.optimizer.lr, lr * 0.1)
            print("lr changed to {}".format(lr * 0.1))
    return K.get_value(bilstm.optimizer.lr)

reduce_lr = LearningRateScheduler(scheduler)
early_stopping = EarlyStopping(monitor='loss',
                               patience=20,
                               min_delta=1e-5,
                               mode='auto',
                               restore_best_weights=False,  # whether to restore the weights from the epoch with the best monitored value
                               verbose=2)
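
Note that early_stopping is created here but never passed to fit() below, so training always runs the full 80 epochs. To actually enable it, add it to the callbacks list; a sketch:

# history = bilstm.fit(X_train, Y_train, epochs=80, batch_size=32,
#                      callbacks=[reduce_lr, early_stopping])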

Build the Bi-LSTM model

# number of features per timestep
input_size = X_train.shape[2]
# number of timesteps: how many steps of history are used to predict the next value
time_steps = X_train.shape[1]
# number of units in the LSTM hidden layer
cell_size = 128
batch_size = 24

bilstm = keras.Sequential()
bilstm.add(Bidirectional(keras.layers.LSTM(
        units=cell_size,  # output dimension of each LSTM direction
        batch_input_shape=(batch_size, time_steps, input_size),  # input dimensions
        stateful=False,   # do not carry state between batches
        ), merge_mode='concat'))
bilstm.add(keras.layers.Dense(64))
bilstm.add(keras.layers.LeakyReLU(alpha=0.3))
bilstm.add(keras.layers.Dense(32))
bilstm.add(keras.layers.LeakyReLU(alpha=0.3))
bilstm.add(keras.layers.Dense(16))
bilstm.add(keras.layers.LeakyReLU(alpha=0.3))
# output layer
bilstm.add(keras.layers.Dense(1))
bilstm.add(keras.layers.LeakyReLU(alpha=0.3))
# optimizer
nadam = keras.optimizers.Nadam(lr=1e-3)
# note: 'accuracy' is not a meaningful metric for a regression loss,
# which is why it stays constant at 0.0187 throughout the training log below
bilstm.compile(optimizer=nadam, loss='mse', metrics=['accuracy'])
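
One detail worth flagging: the model fixes the batch dimension to 24 via batch_input_shape, yet fit() below uses batch_size=32. Since stateful=False, a fixed batch dimension is unnecessary anyway; a sketch of the same input specification with the conventional input_shape, which leaves the batch size free:

# equivalent input specification without a fixed batch dimension
bilstm_alt = keras.Sequential()
bilstm_alt.add(Bidirectional(keras.layers.LSTM(units=cell_size),
                             input_shape=(time_steps, input_size),
                             merge_mode='concat'))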

Train the model

history=bilstm.fit(X_train,Y_train, epochs=80,batch_size=32,callbacks=[reduce_lr])
Epoch 1/80
909/909 [==============================] - 3s 3ms/step - loss: 0.0200 - accuracy: 0.0187
Epoch 2/80
909/909 [==============================] - 1s 594us/step - loss: 0.0071 - accuracy: 0.0187
Epoch 3/80
909/909 [==============================] - 1s 611us/step - loss: 0.0057 - accuracy: 0.0187
Epoch 4/80
909/909 [==============================] - 1s 781us/step - loss: 0.0038 - accuracy: 0.0187
Epoch 5/80
909/909 [==============================] - 1s 719us/step - loss: 0.0037 - accuracy: 0.0187
Epoch 6/80
909/909 [==============================] - 1s 741us/step - loss: 0.0035 - accuracy: 0.0187
Epoch 7/80
909/909 [==============================] - 1s 576us/step - loss: 0.0040 - accuracy: 0.0187
Epoch 8/80
909/909 [==============================] - 1s 686us/step - loss: 0.0033 - accuracy: 0.0187
Epoch 9/80
909/909 [==============================] - 1s 727us/step - loss: 0.0032 - accuracy: 0.0187
Epoch 10/80
909/909 [==============================] - 1s 652us/step - loss: 0.0030 - accuracy: 0.0187
Epoch 11/80
909/909 [==============================] - 1s 610us/step - loss: 0.0033 - accuracy: 0.0187
Epoch 12/80
909/909 [==============================] - 1s 573us/step - loss: 0.0031 - accuracy: 0.0187
Epoch 13/80
909/909 [==============================] - 1s 666us/step - loss: 0.0029 - accuracy: 0.0187
Epoch 14/80
909/909 [==============================] - 1s 552us/step - loss: 0.0028 - accuracy: 0.0187
Epoch 15/80
909/909 [==============================] - 1s 718us/step - loss: 0.0030 - accuracy: 0.0187
Epoch 16/80
909/909 [==============================] - 1s 601us/step - loss: 0.0028 - accuracy: 0.0187
Epoch 17/80
909/909 [==============================] - 0s 541us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 18/80
909/909 [==============================] - 1s 657us/step - loss: 0.0027 - accuracy: 0.0187
Epoch 19/80
909/909 [==============================] - 1s 680us/step - loss: 0.0027 - accuracy: 0.0187
Epoch 20/80
909/909 [==============================] - 1s 703us/step - loss: 0.0028 - accuracy: 0.0187
Epoch 21/80
909/909 [==============================] - 1s 602us/step - loss: 0.0030 - accuracy: 0.0187
Epoch 22/80
909/909 [==============================] - 1s 622us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 23/80
909/909 [==============================] - 1s 700us/step - loss: 0.0028 - accuracy: 0.0187
Epoch 24/80
909/909 [==============================] - 1s 613us/step - loss: 0.0025 - accuracy: 0.0187
Epoch 25/80
909/909 [==============================] - 1s 569us/step - loss: 0.0028 - accuracy: 0.0187
Epoch 26/80
909/909 [==============================] - 0s 525us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 27/80
909/909 [==============================] - 0s 487us/step - loss: 0.0028 - accuracy: 0.0187
Epoch 28/80
909/909 [==============================] - 0s 493us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 29/80
909/909 [==============================] - 0s 494us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 30/80
909/909 [==============================] - 0s 490us/step - loss: 0.0027 - accuracy: 0.0187
Epoch 31/80
909/909 [==============================] - 0s 519us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 32/80
909/909 [==============================] - 0s 494us/step - loss: 0.0027 - accuracy: 0.0187
Epoch 33/80
909/909 [==============================] - 0s 493us/step - loss: 0.0025 - accuracy: 0.0187
Epoch 34/80
909/909 [==============================] - 0s 500us/step - loss: 0.0025 - accuracy: 0.0187
Epoch 35/80
909/909 [==============================] - 0s 505us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 36/80
909/909 [==============================] - 1s 595us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 37/80
909/909 [==============================] - 1s 578us/step - loss: 0.0027 - accuracy: 0.0187
Epoch 38/80
909/909 [==============================] - 0s 518us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 39/80
909/909 [==============================] - 0s 525us/step - loss: 0.0024 - accuracy: 0.0187
Epoch 40/80
909/909 [==============================] - 0s 501us/step - loss: 0.0024 - accuracy: 0.0187
Epoch 41/80
909/909 [==============================] - 0s 500us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 42/80
909/909 [==============================] - 0s 529us/step - loss: 0.0023 - accuracy: 0.0187
Epoch 43/80
909/909 [==============================] - 1s 616us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 44/80
909/909 [==============================] - 1s 596us/step - loss: 0.0027 - accuracy: 0.0187
Epoch 45/80
909/909 [==============================] - 1s 582us/step - loss: 0.0024 - accuracy: 0.0187
Epoch 46/80
909/909 [==============================] - 0s 508us/step - loss: 0.0025 - accuracy: 0.0187
Epoch 47/80
909/909 [==============================] - 1s 574us/step - loss: 0.0025 - accuracy: 0.0187
Epoch 48/80
909/909 [==============================] - 1s 724us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 49/80
909/909 [==============================] - 1s 696us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 50/80
909/909 [==============================] - 1s 667us/step - loss: 0.0026 - accuracy: 0.0187
Epoch 51/80
lr changed to 0.00010000000474974513
909/909 [==============================] - 1s 653us/step - loss: 0.0023 - accuracy: 0.0187
Epoch 52/80
909/909 [==============================] - 1s 703us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 53/80
909/909 [==============================] - 1s 616us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 54/80
909/909 [==============================] - 1s 650us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 55/80
909/909 [==============================] - 1s 648us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 56/80
909/909 [==============================] - 1s 661us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 57/80
909/909 [==============================] - 1s 718us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 58/80
909/909 [==============================] - 1s 687us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 59/80
909/909 [==============================] - 1s 628us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 60/80
909/909 [==============================] - 1s 725us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 61/80
909/909 [==============================] - 1s 697us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 62/80
909/909 [==============================] - 1s 768us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 63/80
909/909 [==============================] - 1s 834us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 64/80
909/909 [==============================] - 1s 755us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 65/80
909/909 [==============================] - 1s 666us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 66/80
909/909 [==============================] - 1s 561us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 67/80
909/909 [==============================] - 1s 565us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 68/80
909/909 [==============================] - 1s 565us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 69/80
909/909 [==============================] - 1s 558us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 70/80
909/909 [==============================] - 0s 542us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 71/80
909/909 [==============================] - 0s 545us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 72/80
909/909 [==============================] - 1s 612us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 73/80
909/909 [==============================] - 1s 647us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 74/80
909/909 [==============================] - 1s 765us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 75/80
909/909 [==============================] - 1s 664us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 76/80
909/909 [==============================] - 1s 817us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 77/80
909/909 [==============================] - 1s 693us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 78/80
909/909 [==============================] - 1s 726us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 79/80
909/909 [==============================] - 1s 681us/step - loss: 0.0022 - accuracy: 0.0187
Epoch 80/80
909/909 [==============================] - 1s 562us/step - loss: 0.0022 - accuracy: 0.0187
history.history.keys()  # inspect which metrics were recorded during training
plt.plot(history.epoch, history.history.get('loss'))  # plot the training loss against the epoch number
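
Only the training loss is recorded here. To also track generalization during training, a validation set can be passed to fit(); the sketch below reuses the test split purely for illustration (in practice a separate validation split is preferable):

# history = bilstm.fit(X_train, Y_train, epochs=80, batch_size=32,
#                      validation_data=(X_test, Y_test), callbacks=[reduce_lr])
# plt.plot(history.epoch, history.history['val_loss'])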

[Figure: training loss over epochs]

Predict

predict = bilstm.predict(X_test)
# the scaler was fitted on all 17 columns, so the prediction is concatenated back onto
# the input features before inverse_transform; the last column is then the rescaled value
real_predict = scaler.inverse_transform(np.concatenate((source_x_test, predict), axis=1))
real_y = scaler.inverse_transform(np.concatenate((source_x_test, Y_test), axis=1))
real_predict = real_predict[:, -1]
real_y = real_y[:, -1]

Evaluate the error

plt.figure(figsize=(15, 6))
bwith = 0.75  # set the border width to 0.75
ax = plt.gca()  # get the current axes to style the border
ax.spines['bottom'].set_linewidth(bwith)
ax.spines['left'].set_linewidth(bwith)
ax.spines['top'].set_linewidth(bwith)
ax.spines['right'].set_linewidth(bwith)
plt.plot(real_predict, label='real_predict')
plt.plot(real_y, label='real_y')
plt.plot(real_y*(1+0.15), label='+15% bound', linestyle='--', color='green')
plt.plot(real_y*(1-0.15), label='-15% bound', linestyle='--', color='green')
plt.fill_between(range(0, 12), real_y*(1+0.15), real_y*(1-0.15), color='gray', alpha=0.2)
plt.legend()
plt.show()

[Figure: predicted vs. actual values on the test set, with a ±15% tolerance band]

round(mean_squared_error(Y_test, predict), 4)  # MSE on the normalized scale
0.0012
from sklearn.metrics import r2_score
round(r2_score(real_y,real_predict),4)
0.5152
per_real_loss = (real_y - real_predict) / real_y
avg_per_real_loss = sum(abs(per_real_loss)) / len(per_real_loss)  # mean absolute percentage error (MAPE)
print(avg_per_real_loss)
0.12909395542298405
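
The value above is the mean absolute percentage error (MAPE), roughly 12.9%. As a cross-check, scikit-learn (version 0.24 and later) provides the same metric directly:

from sklearn.metrics import mean_absolute_percentage_error
mean_absolute_percentage_error(real_y, real_predict)  # should match avg_per_real_loss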
# compute the prediction accuracy at a given tolerance level
# level is a fraction, e.g. 0.15 for a ±15% tolerance
def comput_acc(real, predict, level):
    num_error = 0
    for i in range(len(real)):
        if abs(real[i] - predict[i]) / real[i] > level:
            num_error += 1
    return 1 - num_error / len(real)
comput_acc(real_y,real_predict,0.2),comput_acc(real_y,real_predict,0.15),comput_acc(real_y,real_predict,0.1)
(0.8333333333333334, 0.6666666666666667, 0.5833333333333333)
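
The same tolerance-accuracy metric can be written more compactly with NumPy; a sketch of an equivalent vectorized version:

def comput_acc_np(real, predict, level):
    real, predict = np.asarray(real), np.asarray(predict)
    # fraction of points whose relative error is within the tolerance
    return np.mean(np.abs(real - predict) / real <= level)

# comput_acc_np(real_y, real_predict, 0.2) matches comput_acc(real_y, real_predict, 0.2)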