DL GRU (TensorFlow): Regression prediction on the Maotai stock dataset with the GRU algorithm (saving the model's .ckpt.index and .ckpt.data files)


Contents

Regression prediction on the Maotai stock dataset with the GRU algorithm (saving the model's .ckpt.index and .ckpt.data files)

# 1. Define the dataset

# 2. Dataset preprocessing

# 2.1 Split the dataset

# 2.2 Convert the data dimensions

# 2.3 MinMax-normalize the training and test sets

# 2.4 Build the time-series matrices for train and test

# (1) Build the train time-series matrix with a for loop

# (2) Build the test time-series matrix with a for loop

# 3. Build the GRU model

# 3.1 Model construction

# 3.2 Compile the model and define the optimizer and loss function

# 3.3 Train the model and save the checkpoint files

# Standardize the dimensions of the model input data

# Create and save the weights.txt file

# Visualize the training process: plot the loss

epoch=5

# 3.4 Model evaluation

# Inverse MinMax-transform the real and predicted data back to the original scale

# Plot the real vs. predicted price curves

# Output the model evaluation metrics

# Save the prediction results


# 1. Define the dataset

Dataset download: http://quotes.money.163.com/trade/lsjysj_600519.html


The first rows of the downloaded CSV (newest dates first):

| Date | Code | Name | Close | High | Low | Open | Prev Close | Change | Change (%) | Turnover (%) | Volume | Amount | Total Mkt Cap | Float Mkt Cap |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2022/6/27 | 600519 | 贵州茅台 | 2010.55 | 2049.94 | 2000.3 | 2019.94 | 2009.01 | 1.54 | 0.0767 | 0.3193 | 4011517 | 8124448900 | 2.53E+12 | 2.53E+12 |
| 2022/6/24 | 600519 | 贵州茅台 | 2009.01 | 2020 | 1965 | 1970 | 1957.1 | 51.91 | 2.6524 | 0.3155 | 3963465 | 7921199792 | 2.52E+12 | 2.52E+12 |
| 2022/6/23 | 600519 | 贵州茅台 | 1957.1 | 1965.04 | 1940 | 1942.7 | 1936 | 21.1 | 1.0899 | 0.2137 | 2684352 | 5239860443 | 2.46E+12 | 2.46E+12 |
| 2022/6/22 | 600519 | 贵州茅台 | 1936 | 1958 | 1932 | 1955 | 1945.74 | -9.74 | -0.5006 | 0.1564 | 1964665 | 3813775294 | 2.43E+12 | 2.43E+12 |
| 2022/6/21 | 600519 | 贵州茅台 | 1945.74 | 1966.99 | 1928 | 1949 | 1942.02 | 3.72 | 0.1916 | 0.1888 | 2371702 | 4617805127 | 2.44E+12 | 2.44E+12 |
| 2022/6/20 | 600519 | 贵州茅台 | 1942.02 | 1970 | 1930 | 1950 | 1951 | -8.98 | -0.4603 | 0.2784 | 3497478 | 6802792459 | 2.44E+12 | 2.44E+12 |
| 2022/6/17 | 600519 | 贵州茅台 | 1951 | 1952 | 1878.09 | 1878.09 | 1877 | 74 | 3.9425 | 0.4023 | 5054161 | 9749530916 | 2.45E+12 | 2.45E+12 |
| 2022/6/16 | 600519 | 贵州茅台 | 1877 | 1907.63 | 1875.33 | 1894.59 | 1875.1 | 1.9 | 0.1013 | 0.214 | 2688670 | 5087605391 | 2.36E+12 | 2.36E+12 |
| 2022/6/15 | 600519 | 贵州茅台 | 1875.1 | 1905 | 1862.99 | 1870 | 1871 | 4.1 | 0.2191 | 0.268 | 3366362 | 6354869100 | 2.36E+12 | 2.36E+12 |
| 2022/6/14 | 600519 | 贵州茅台 | 1871 | 1875.42 | 1832 | 1834 | 1856 | 15 | 0.8082 | 0.2342 | 2941623 | 5467949348 | 2.35E+12 | 2.35E+12 |
| 2022/6/13 | 600519 | 贵州茅台 | 1856 | 1892 | 1848.08 | 1890 | 1900.6 | -44.6 | -2.3466 | 0.2926 | 3675518 | 6847248995 | 2.33E+12 | 2.33E+12 |
| 2022/6/10 | 600519 | 贵州茅台 | 1900.6 | 1907 | 1835 | 1845.01 | 1853 | 47.6 | 2.5688 | 0.3769 | 4734462 | 8882462598 | 2.39E+12 | 2.39E+12 |
| 2022/6/9 | 600519 | 贵州茅台 | 1853 | 1888.35 | 1849 | 1872 | 1865.6 | -12.6 | -0.6754 | 0.2096 | 2632902 | 4897066622 | 2.33E+12 | 2.33E+12 |
| 2022/6/8 | 600519 | 贵州茅台 | 1865.6 | 1882 | 1825 | 1825 | 1817.9 | 47.7 | 2.6239 | 0.3531 | 4435381 | 8236953846 | 2.34E+12 | 2.34E+12 |
| 2022/6/7 | 600519 | 贵州茅台 | 1817.9 | 1825 | 1770.31 | 1784.14 | 1788 | 29.9 | 1.6723 | 0.279 | 3504859 | 6356031009 | 2.28E+12 | 2.28E+12 |
| 2022/6/6 | 600519 | 贵州茅台 | 1788 | 1795 | 1758 | 1790 | 1786 | 2 | 0.112 | 0.2925 | 3674126 | 6535329352 | 2.25E+12 | 2.25E+12 |
| 2022/6/2 | 600519 | 贵州茅台 | 1786 | 1795.8 | 1780 | 1787.97 | 1788.25 | -2.25 | -0.1258 | 0.1347 | 1691473 | 3019718032 | 2.24E+12 | 2.24E+12 |
| 2022/6/1 | 600519 | 贵州茅台 | 1788.25 | 1814.78 | 1779 | 1802 | 1804.03 | -15.78 | -0.8747 | 0.1732 | 2176001 | 3897858999 | 2.25E+12 | 2.25E+12 |
| 2022/5/31 | 600519 | 贵州茅台 | 1804.03 | 1814.9 | 1766.98 | 1774.77 | 1778.41 | 25.62 | 1.4406 | 0.3244 | 4075082 | 7329201058 | 2.27E+12 | 2.27E+12 |
| 2022/5/30 | 600519 | 贵州茅台 | 1778.41 | 1790.55 | 1766 | 1766 | 1755.16 | 23.25 | 1.3247 | 0.2744 | 3446569 | 6135631304 | 2.23E+12 | 2.23E+12 |
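The post itself does not show the loading code; a minimal sketch of reading the downloaded CSV, assuming the exported file is named 600519.csv and is GBK-encoded (both are assumptions):

import pandas as pd

# Hypothetical file name; NetEase CSV exports are commonly GBK-encoded
df = pd.read_csv('600519.csv', encoding='gbk')
print(df.shape)  # (5052, 15) would match the 4752/300 split reported below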

# 2. Dataset preprocessing

# 2.1 Split the dataset

training_set
 [2019.94 1970.   1942.7  ...   26.07   25.92   26.5 ]
test_set
 [26.5   0.   25.69 25.6  26.3  25.92 26.   26.24 26.48 26.   25.8  25.8
 ...
 36.8  35.4  36.5  37.35 37.61 37.01 37.2  37.15 36.28 36.98 34.99 34.51]
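A sketch of a split consistent with the printouts above: the opening-price column is used, the first 4752 rows (in the CSV's newest-first order) become the training set, and the last 300 rows the test set. The column name follows the raw CSV; the variable names are assumptions.

# Opening-price column (开盘价), kept in the CSV's row order
open_prices = df['开盘价'].values
training_set = open_prices[:-300]  # first 4752 rows
test_set = open_prices[-300:]      # last 300 rows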

# 2.2 Convert the data dimensions

Before applying MinMaxScaler, the data must be reshaped from (4752,) to (4752, 1):

before reshape <class 'numpy.ndarray'> (4752,) (300,)
after reshape <class 'numpy.ndarray'> (4752, 1) (300, 1)
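A sketch of the reshape that produces the printout above, continuing the previous sketch:

import numpy as np

print('before reshape', type(training_set), training_set.shape, test_set.shape)
training_set = np.reshape(training_set, (-1, 1))  # (4752,) -> (4752, 1)
test_set = np.reshape(test_set, (-1, 1))          # (300,)  -> (300, 1)
print('after reshape', type(training_set), training_set.shape, test_set.shape)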

# 2.3 MinMax-normalize the training and test sets
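A minimal sketch using scikit-learn's MinMaxScaler; fitting on the training set only and reusing its min/max for the test set avoids leaking test information into the scaling:

from sklearn.preprocessing import MinMaxScaler

sc = MinMaxScaler(feature_range=(0, 1))
training_set_scaled = sc.fit_transform(training_set)  # fit on train only
test_set_scaled = sc.transform(test_set)              # reuse the train min/max
# sc is reused later to inverse-transform predictions back to price scale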

# 2.4 Build the time-series matrices for train and test

# (1) Build the train time-series matrix with a for loop

Take X_num = 60 consecutive days of opening prices from the training set as one input sample of x_train, and use the 61st day's opening price as the label; the for loop builds 4752 - 60 = 4692 samples in total.

[table preview: the first rows of the x_train window matrix; each row holds 60 consecutive normalized opening prices, and the window advances by one day per row]
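A sketch of the sliding-window loop described above:

X_num = 60
x_train, y_train = [], []
for i in range(X_num, len(training_set_scaled)):
    x_train.append(training_set_scaled[i - X_num:i, 0])  # 60 consecutive days
    y_train.append(training_set_scaled[i, 0])            # day 61 as the label
# yields 4752 - 60 = 4692 samples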

Then shuffle x_train and y_train in unison and convert them to numpy arrays.

[table preview: the first rows of x_train after shuffling; the 60-step windows are no longer in chronological order]
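A sketch of the shuffle step; re-seeding the generator with the same value before each shuffle keeps features and labels aligned (the seed value itself is an assumption):

import numpy as np

np.random.seed(7)
np.random.shuffle(x_train)
np.random.seed(7)  # identical seed so y_train is permuted the same way
np.random.shuffle(y_train)
x_train, y_train = np.array(x_train), np.array(y_train)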

# (2) Build the test time-series matrix with a for loop

Test set: the last 300 rows of the CSV; the for loop builds 300 - 60 = 240 samples.

Convert the DataFrame-format data to array format.
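A sketch of the same windowing applied to the test set:

x_test, y_test = [], []
for i in range(X_num, len(test_set_scaled)):
    x_test.append(test_set_scaled[i - X_num:i, 0])
    y_test.append(test_set_scaled[i, 0])
x_test, y_test = np.array(x_test), np.array(y_test)  # 300 - 60 = 240 samples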

# 3. Build the GRU model

# 3.1 Model construction
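A sketch of a model consistent with the summary printed in section 3.3: GRU(80) returning sequences, dropout, GRU(100), dropout, and a single-unit Dense head. reset_after=False is inferred from the parameter counts (19680 and 54300); the dropout rate of 0.2 is an assumption.

import tensorflow as tf
from tensorflow.keras.layers import GRU, Dropout, Dense

model = tf.keras.Sequential([
    GRU(80, return_sequences=True, reset_after=False),  # -> (None, 60, 80)
    Dropout(0.2),                                       # rate is an assumption
    GRU(100, reset_after=False),                        # -> (None, 100)
    Dropout(0.2),
    Dense(1)                                            # next-day opening price
])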

# 3.2 Compile the model and define the optimizer and loss function
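A sketch of a typical compile step for this regression setup; the Adam learning rate is an assumption:

model.compile(optimizer=tf.keras.optimizers.Adam(0.001),
              loss='mean_squared_error')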

# 3.3 Train the model and save the checkpoint files

# Standardize the dimensions of the model input data

x_train must be reshaped to match the RNN input requirement: [number of samples, number of unrolled time steps, number of input features per step].

before x_train.shape[0]: 4692
after x_train.shape: (4692, 60, 1)
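Continuing the earlier sketches, a reshape plus a training call that writes .ckpt.index/.ckpt.data files through the ModelCheckpoint callback; the checkpoint path and batch size are assumptions:

x_train = np.reshape(x_train, (x_train.shape[0], 60, 1))
x_test = np.reshape(x_test, (x_test.shape[0], 60, 1))

checkpoint_path = './checkpoint/GRU_stock.ckpt'  # hypothetical path
cp_callback = tf.keras.callbacks.ModelCheckpoint(filepath=checkpoint_path,
                                                 save_weights_only=True,
                                                 save_best_only=True)

history = model.fit(x_train, y_train, batch_size=64, epochs=5,
                    validation_data=(x_test, y_test), callbacks=[cp_callback])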

# Create and save the weights.txt file
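A sketch of dumping every trainable variable's name, shape, and values to weights.txt:

model.summary()
with open('./weights.txt', 'w') as f:
    for v in model.trainable_variables:
        f.write(str(v.name) + '\n')
        f.write(str(v.shape) + '\n')
        f.write(str(v.numpy()) + '\n')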


_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
gru (GRU)                    (None, 60, 80)            19680
_________________________________________________________________
dropout (Dropout)            (None, 60, 80)            0
_________________________________________________________________
gru_1 (GRU)                  (None, 100)               54300
_________________________________________________________________
dropout_1 (Dropout)          (None, 100)               0
_________________________________________________________________
dense (Dense)                (None, 1)                 101
=================================================================
Total params: 74,081
Trainable params: 74,081
Non-trainable params: 0
_________________________________________________________________

# Visualize the training process: plot the loss

epoch=5

[figure: training loss curve over 5 epochs]
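A sketch of the loss plot, assuming validation loss was tracked during fit:

import matplotlib.pyplot as plt

plt.plot(history.history['loss'], label='Training Loss')
plt.plot(history.history['val_loss'], label='Validation Loss')
plt.title('Training and Validation Loss')
plt.legend()
plt.show()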

# 3.4 Model evaluation

# Inverse MinMax-transform the real and predicted data back to the original scale
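A sketch of predicting on the test windows and mapping both the predictions and the aligned real prices back to the original scale with the scaler fitted in section 2.3:

predicted = model.predict(x_test)                        # (240, 1), normalized
predicted_price = sc.inverse_transform(predicted)        # back to price scale
real_price = sc.inverse_transform(test_set_scaled[60:])  # labels of the 240 windows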

# Plot the real vs. predicted price curves

[figure: real vs. predicted stock price curves on the test set]
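A sketch of the comparison plot:

plt.plot(real_price, color='red', label='Real Price')
plt.plot(predicted_price, color='blue', label='Predicted Price')
plt.title('Maotai Stock Price Prediction')
plt.xlabel('Time')
plt.ylabel('Price')
plt.legend()
plt.show()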

# Output the model evaluation metrics
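A sketch of how such metrics are commonly computed with scikit-learn; the two result blocks below are the values reported in the original post:

import math
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

r2 = r2_score(real_price, predicted_price)
mse = mean_squared_error(real_price, predicted_price)
rmse = math.sqrt(mse)
mae = mean_absolute_error(real_price, predicted_price)
print('R2: %.4f' % r2)
print('MSE: %.4f' % mse)
print('RMSE: %.4f' % rmse)
print('MAE: %.4f' % mae)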

R2: 0.5177
MSE: 1.8693
RMSE: 1.3672
MAE: 1.2081

R2: 0.8342
MSE: 0.6269
RMSE: 0.7918
MAE: 0.5756


# Save the prediction results
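A sketch of saving the real and predicted prices side by side; the output file name is an assumption:

result = pd.DataFrame({'real': real_price.flatten(),
                       'predicted': predicted_price.flatten()})
result.to_csv('GRU_prediction_results.csv', index=False)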

Original: https://blog.csdn.net/qq_41185868/article/details/125510867
Author: 一个处女座的程序猿
Title: DL之GRU(Tensorflow框架):基于茅台股票数据集利用GRU算法实现回归预测(保存模型.ckpt.index、.ckpt.data文件)



Related Reading

睿智的目标检测61: Tensorflow2 Focal Loss Explained and Its Implementation in YoloV4

Preface

Here is the TF2 version as well. It is essentially identical to the Keras version.

What Is Focal Loss?

Focal Loss is a scheme for computing the loss. It has two important properties:

1. It controls the weights of positive and negative samples.
2. It controls the weights of easily classified and hard-to-classify samples.

Positive and negative samples are defined as follows:
object detection is essentially dense sampling. Thousands of prior boxes (or feature points) are generated for an image, and ground-truth boxes are matched against some of them. A prior box that is matched is a positive sample; one that is not matched is a negative sample.

Easy and hard samples are defined as follows:
suppose a binary classification problem in which sample 1 and sample 2 both belong to class 1. In the network's predictions, sample 1 belongs to class 1 with probability 0.9, and sample 2 with probability 0.6. The former prediction is accurate, so sample 1 is an easy sample; the latter is less accurate, so sample 2 is a hard sample.

How is this weight control achieved? Read on.

1. Controlling the Weights of Positive and Negative Samples

The standard binary cross-entropy loss is:

$$CE(p, y) = \begin{cases} -\log(p), & y = 1 \\ -\log(1 - p), & \text{otherwise} \end{cases}$$

We can simplify it by defining $p_t$:

$$p_t = \begin{cases} p, & y = 1 \\ 1 - p, & \text{otherwise} \end{cases}$$

so that:

$$CE(p_t) = -\log(p_t)$$

To reduce the influence of negative samples, we can place a coefficient $\alpha_t$ in front of the usual loss, defined analogously to $p_t$:
when label = 1, $\alpha_t = \alpha$;
otherwise, $\alpha_t = 1 - \alpha$.

$$CE(p_t) = -\alpha_t \log(p_t)$$

$\alpha$ ranges from 0 to 1, so by setting $\alpha$ we control how much positive and negative samples contribute to the loss.
Expanded, this is:

$$CE(p, y, \alpha) = \begin{cases} -\alpha \log(p), & y = 1 \\ -(1 - \alpha) \log(1 - p), & \text{otherwise} \end{cases}$$

2. Controlling the Weights of Easy and Hard Samples

The higher the predicted probability for a sample's true class, the easier the sample is to classify. In binary classification, positive samples have label 1, negative samples have label 0, and $p$ is the predicted probability of class 1.

For positive samples, the larger $1 - p$ is, the harder the sample is to classify.
For negative samples, the larger $p$ is, the harder the sample is to classify.

With $p_t$ defined as above:

$$p_t = \begin{cases} p, & y = 1 \\ 1 - p, & \text{otherwise} \end{cases}$$

$1 - p_t$ therefore measures how hard each sample is to classify.

Focal Loss uses this as follows:

$$FL(p_t) = -(1 - p_t)^{\gamma} \log(p_t)$$

where $(1 - p_t)^{\gamma}$ reflects how easy each sample is to distinguish; $\gamma$ is called the modulating factor.

1. When $p_t$ tends to 0, the modulating factor tends to 1 and the sample contributes strongly to the total loss; when $p_t$ tends to 1, the modulating factor tends to 0 and the contribution is small.
2. When $\gamma = 0$, focal loss reduces to the ordinary cross-entropy loss; adjusting $\gamma$ changes the modulating factor.

3. Combining the Two Weighting Methods

The following formula controls both the positive/negative sample weights and the easy/hard sample weights:

$$FL(p_t) = -\alpha_t (1 - p_t)^{\gamma} \log(p_t)$$
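Before the YoloV4-specific code below, here is a minimal standalone sketch of this combined formula as a binary focal loss in TensorFlow 2, assuming the predictions arrive as logits; the alpha and gamma defaults follow the Focal Loss paper:

import tensorflow as tf
import tensorflow.keras.backend as K

def binary_focal_loss(y_true, logits, alpha=0.25, gamma=2.0):
    p = tf.sigmoid(logits)
    bce = K.binary_crossentropy(y_true, logits, from_logits=True)  # -log(p_t)
    p_t = y_true * p + (1.0 - y_true) * (1.0 - p)
    alpha_t = y_true * alpha + (1.0 - y_true) * (1.0 - alpha)
    return alpha_t * (1.0 - p_t) ** gamma * bce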

Implementation

This article uses the TF2 (Keras API) version of YoloV4 as the example for the walkthrough; the YoloV4 repository is at:
https://github.com/bubbliiiing/yolov4-tf2

First, locate the loss term in YoloV4 that distinguishes positive from negative samples. YoloV4's loss has three parts:
location_loss (box regression loss)
confidence_loss (object confidence loss)
class_loss (classification loss)
The part that distinguishes positive from negative samples is confidence_loss, so that is where we add Focal Loss.

Then locate the probability p from the formula. raw_pred holds the prediction for each feature point; take its confidence slice and apply a sigmoid to obtain the probability p:

tf.sigmoid(raw_pred[...,4:5])

Next, balance positive and negative samples by introducing the parameter alpha:

alpha
1-alpha

Then balance easy and hard samples by introducing the parameter gamma:

(tf.ones_like(raw_pred[...,4:5]) - tf.sigmoid(raw_pred[...,4:5])) ** gamma
tf.sigmoid(raw_pred[...,4:5]) ** gamma

Finally, multiply these factors into the original cross-entropy loss:

# First term: positive samples (object_mask = 1), weighted by alpha with modulation (1 - p)^gamma;
# second term: negative samples kept by ignore_mask, weighted by (1 - alpha) with modulation p^gamma.
confidence_loss = object_mask * (tf.ones_like(raw_pred[...,4:5]) - tf.sigmoid(raw_pred[...,4:5])) ** gamma * alpha * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True) + \
            (1 - object_mask) * ignore_mask * tf.sigmoid(raw_pred[...,4:5]) ** gamma * (1 - alpha) * K.binary_crossentropy(object_mask, raw_pred[...,4:5], from_logits=True)

Original: https://blog.csdn.net/weixin_44791964/article/details/123595615
Author: Bubbliiiing
Title: 睿智的目标检测61——Tensorflow2 Focal loss详解与在YoloV4当中的实现
