I've hit a wall with my backpropagation neural network and I can't figure out what the problem might be. Any help is appreciated.

In this sample run I used a 3-layer ANN.

The first layer has 10 input neurons, the second layer has 7 hidden neurons, and the third layer has 2 output neurons.

I use the sigmoid activation function for the hidden neurons and the hyperbolic tangent (tanh) activation function for the output neurons.

For this test I used a learning rate of 0.1 and a momentum constant of 0.01 (I have also tried various other combinations, and the output still shows the same problem).
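
To make the setup concrete, here is a minimal sketch of the forward pass I described (10 inputs, 7 sigmoid hidden neurons, 2 tanh outputs). The initialization and variable names are illustrative, not my actual code:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)
    W1 = rng.uniform(-0.5, 0.5, (7, 10))  # hidden-layer weights, 7 x 10
    b1 = np.zeros(7)
    W2 = rng.uniform(-0.5, 0.5, (2, 7))   # output-layer weights, 2 x 7
    b2 = np.zeros(2)

    def forward(x):
        h = sigmoid(W1 @ x + b1)  # hidden layer uses sigmoid
        y = np.tanh(W2 @ h + b2)  # output layer uses tanh, range (-1, 1)
        return y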

As the sample output below shows, the training algorithm stops after n iterations, but the error remains high.

I train the network with the dataset below, and after training I use the same data to test it.

I train the network after mapping all ten inputs to values between 0 and 1: I simply divide each input by 5000, and if the result is greater than 1 I set it to 1. For the outputs, I map them to between -1 and 1 by dividing by 100.
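
In code, the scaling I described amounts to this (a sketch; scale_row is just a hypothetical helper, and raw_row is one 12-value row of the data below):

    def scale_row(raw_row):
        # 10 inputs -> [0, 1]: divide by 5000 and clip at 1.
        inputs = [min(v / 5000.0, 1.0) for v in raw_row[:10]]
        # 2 outputs -> [-1, 1]: divide by 100.
        targets = [v / 100.0 for v in raw_row[10:]]
        return inputs, targets

    # First row of the training data below:
    x, t = scale_row([1768, 1946, 2442, 4305, 7542, 6584, 2324, 1487, 1099, 975, 0, -1])
    # x[0] == 0.3536 and t == [0.0, -0.01]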

As you can see, the network produces the same output for different inputs.

Does anyone know what the problem might be?

Any input is very welcome.

Thanks

***** Sample output *****

T1 is 0, ANN output is 0.0904967273528686; T2 is -0.01, ANN output is -0.399660970430866

T1 is 0, ANN output is 0.0904967273528686; T2 is -0.14, ANN output is -0.399660970430866

T1 is 0, ANN output is 0.0904967273528686; T2 is -0.14, ANN output is -0.399660970430866

T1 is 0, ANN output is 0.0904967273528686; T2 is -0.14, ANN output is -0.399660970430866

T1 is 0, ANN output is 0.0904967273528686; T2 is -0.24, ANN output is -0.399660970430866

T1 is 0, ANN output is 0.0904967273528686; T2 is -0.32, ANN output is -0.399660970430866

T1 is 0.04, ANN output is 0.0904967273528686; T2 is -0.32, ANN output is -0.399660970430866

T1 is 0.04, ANN output is 0.0904967273528686; T2 is -0.32, ANN output is -0.399660970430866

T1 is 0.16, ANN output is 0.0904967273528686; T2 is -0.32, ANN output is -0.399660970430866

T1 is 0.18, ANN output is 0.0904967273528686; T2 is -0.32, ANN output is -0.399660970430866

T1 is 0.18, ANN output is 0.0904967273528686; T2 is -0.35, ANN output is -0.399660970430866

T1 is 0.18, ANN output is 0.0904967273528686; T2 is -0.35, ANN output is -0.399660970430866

***** Sample training data (also used as test data after training) *****

1768,1946,2442,4305,7542,6584,2324,1487,1099,975,0,-1
1767,1946,2444,4309,7551,6575,2318,1483,1100,976,0,-14
1764,1942,2441,4212,7555,6565,2311,1480,1097,972,0,-14
1764,1942,2441,4212,7555,6565,2311,1480,1097,972,0,-14
1762,1944,2445,4339,7571,6500,2302,1478,1095,970,0,-24
1762,1944,2445,4339,7571,6500,2302,1478,1095,970,0,-32
1762,1948,2460,4404,7628,6408,2266,1482,1100,973,4,-32
1762,1948,2460,4404,7628,6408,2266,1482,1100,973,4,-32
1742,1909,2381,4186,7375,6882,2432,1533,1163,999,16,-32
1727,1879,2316,4142,7139,7201,2589,1563,1207,1015,18,-32
1712,1859,2288,4232,7051,7275,2658,1570,1236,1038,18,-35
1712,1859,2288,4232,7051,7275,2658,1570,1236,1038,18,-35
1693,1829,2240,4146,6892,7485,2784,1592,1267,1086,18,-37
1693,1829,2240,4146,6892,7485,2784,1592,1267,1086,16,-40
1669,1802,2199,4229,6766,7528,2966,1624,1282,1126,16,-40
1669,1802,2199,4229,6766,7528,2966,1624,1282,1126,16,-40
1669,1802,2199,4229,6766,7528,2966,1624,1282,1126,16,-40
1648,1772,2153,4301,6602,7608,3244,1681,1289,1160,14,-40
1648,1772,2153,4301,6602,7608,3244,1681,1289,1160,4,-40
1648,1772,2153,4301,6602,7608,3244,1681,1289,1160,4,-40
1640,1790,2217,4722,7082,7233,2860,1612,1249,1159,4,-40
1632,1788,2227,5122,7214,7055,2698,1603,1249,1140,4,-40
1632,1788,2227,5122,7214,7055,2698,1603,1249,1140,4,-40
1621,1782,3170,4590,7293,4816,2829,1608,1257,1118,4,-40
1621,1782,3170,4590,7293,4816,2829,1608,1257,1118,4,-40
1616,1780,4302,3004,6751,6844,2860,1606,1262,1121,4,-40
1616,1780,4302,3004,6751,6844,2860,1606,1262,1121,4,-40
1616,1780,4302,3004,6751,6844,2860,1606,1262,1121,4,-40
1592,1025,5023,3609,7446,6739,2852,1602,1265,1125,4,-40
1592,1025,5023,3609,7446,6739,2852,1602,1265,1125,4,-40
1592,1025,5023,3609,7446,6739,2852,1602,1265,1125,4,-40
1604,1779,5664,3645,7522,6605,2863,1595,1264,1124,4,-40
1601,1945,6094,3701,7592,6487,2918,1590,1262,1127,7,-40
1587,3426,5418,3561,7437,6662,3140,1626,1288,1142,7,-40
1587,3426,5418,3561,7437,6662,3140,1626,1288,1142,7,-40
1581,5498,4155,3570,7458,6616,3251,1632,1296,1153,7,-40
1581,5498,4155,3570,7458,6616,3251,1632,1296,1153,7,-40
1581,5498,4155,3570,7458,6616,3251,1632,1296,1153,7,-40
1581,5498,4155,3570,7458,6616,3251,1632,1296,1153,7,-40
1581,5498,4155,3570,7458,6616,3251,1632,1296,1153,7,-40
2795,7097,2235,3606,7513,6514,3412,1634,1305,1165,7,-40
2795,7097,2235,3606,7513,6514,3412,1634,1305,1165,7,-40
4374,5780,2240,3635,7554,6458,3337,2228,1306,1169,7,-40
4374,5780,2240,3635,7554,6458,3337,2228,1306,1169,7,-40
4374,5780,2240,3635,7554,6458,3337,2228,1306,1169,7,-40
6657,2127,2247,3679,7613,6373,3085,3845,1307,1173,7,-40
5794,1755,2249,3691,7632,6333,2983,5134,1300,1175,9,-40
4045,1735,2208,3576,7510,6527,2932,4549,3576,1188,9,-40
4045,1735,2208,3576,7510,6527,2932,4549,3576,1188,9,-40
4045,1735,2208,3576,7510,6527,2932,4549,3576,1188,9,-40
4045,1735,2208,3576,7510,6527,2932,4549,3576,1188,9,-40
1543,1725,2194,3556,7501,6556,2787,2384,8183,1484,9,-40
1534,1718,2183,3534,7483,6590,2713,2255,6696,3743,9,-40
1529,1709,2171,3508,7461,6636,2673,2143,5150,5561,9,-40
1525,1705,2162,3492,7446,6673,2659,2042,3743,7173,9,-40
1525,1705,2162,3492,7446,6673,2659,2042,3743,7173,9,-40
1524,1704,2160,3487,7443,6677,2660,2023,3561,7348,9,-40
1521,1701,2155,3478,7434,6702,2662,1919,1940,8149,9,-40
1521,1701,2155,3478,7434,6702,2662,1919,1940,8149,9,-40
1521,1701,2155,3478,7434,6702,2662,1919,1940,8149,9,-40
1521,1701,2155,3478,7434,6702,2662,1919,1940,8149,9,-40
1521,1701,2155,3478,7434,6702,2662,1919,1940,8149,9,-40
1508,1677,2122,3425,7400,6764,2702,1741,1585,1947,9,-40
1508,1677,2122,3425,7400,6764,2702,1741,1585,1947,9,-40
1505,1672,2119,3415,7401,6772,2709,1756,1476,1659,9,-40
1505,1672,2119,3415,7401,6772,2709,1756,1476,1659,9,-40
1505,1672,2119,3415,7401,6772,2709,1756,1476,1659,9,-40
1505,1672,2119,3415,7401,6772,2709,1756,1476,1659,9,-40
1483,1652,2093,3370,7356,6817,2789,1723,1409,1308,9,-40
1477,1648,2085,3357,7348,6837,2830,1704,1418,1267,9,-40
1476,1642,2076,3337,7323,6866,2873,1705,1416,1276,9,-40
1476,1642,2076,3337,7323,6866,2873,1705,1416,1276,9,-40
1476,1642,2076,3337,7323,6866,2873,1705,1416,1276,9,-40
1472,1638,2073,3336,7329,6862,2888,1730,1395,1282,9,-40
1461,1626,2057,3314,7311,6879,2900,1759,1369,1285,9,-40
1461,1626,2056,3314,7310,6880,2902,1764,1370,1284,9,-40
1458,1621,2052,3304,7311,6891,2915,1792,1382,1262,9,-40
1453,1616,2043,3292,7302,6908,2929,1821,1386,1250,9,-40
1444,1609,2034,3275,7285,6931,2952,1846,1416,1259,9,-40
1443,1610,2036,3278,7287,6924,2947,1847,1418,1261,9,-40
1440,1606,2031,3277,7297,6916,2944,1846,1435,1263,9,-40
1432,1597,2017,3248,7265,6954,2971,1858,1469,1272,9,-40
1432,1597,2017,3248,7265,6954,2971,1858,1469,1272,9,-40
1431,1591,2013,3247,7265,6948,2974,1863,1476,1298,9,-40
1431,1591,2013,3247,7265,6948,2974,1863,1476,1298,9,-40
1428,1587,2008,3240,7263,6947,2982,1870,1483,1326,7,-6


Just to be careful, since I thought the training data might contain too many similar outputs, I manually changed some of the outputs to make them more realistic; the problem of identical outputs still persists.

Here is my modified training dataset:

1768,1946,2442,4305,7542,6584,2324,1487,1099,975,-1,-6
1767,1946,2444,4309,7551,6575,2318,1483,1100,976,0,-44
1764,1942,2441,4212,7555,6565,2311,1480,1097,972,1,-54
1764,1942,2441,4212,7555,6565,2311,1480,1097,972,3,-44
1762,1944,2445,4339,7571,6500,2302,1478,1095,970,3,-24
1762,1944,2445,4339,7571,6500,2302,1478,1095,970,-2,-32
1762,1948,2460,4404,7628,6408,2266,1482,1100,973,4,-32
1762,1948,2460,4404,7628,6408,2266,1482,1100,973,4,-32
1742,1909,2381,4186,7375,6882,2432,1533,1163,999,16,-32
1727,1879,2316,4142,7139,7201,2589,1563,1207,1015,18,-32
1712,1859,2288,4232,7051,7275,2658,1570,1236,1038,22,-35
1712,1859,2288,4232,7051,7275,2658,1570,1236,1038,20,-35
1693,1829,2240,4146,6892,7485,2784,1592,1267,1086,30,-37
1693,1829,2240,4146,6892,7485,2784,1592,1267,1086,15,-40
1669,1802,2199,4229,6766,7528,2966,1624,1282,1126,16,-60
1669,1802,2199,4229,6766,7528,2966,1624,1282,1126,17,-50
1669,1802,2199,4229,6766,7528,2966,1624,1282,1126,18,-30
1648,1772,2153,4301,6602,7608,3244,1681,1289,1160,19,-10
1648,1772,2153,4301,6602,7608,3244,1681,1289,1160,3,-80
1648,1772,2153,4301,6602,7608,3244,1681,1289,1160,-3,-90
1640,1790,2217,4722,7082,7233,2860,1612,1249,1159,2,-70
1632,1788,2227,5122,7214,7055,2698,1603,1249,1140,-2,-40
1632,1788,2227,5122,7214,7055,2698,1603,1249,1140,4,-40
1621,1782,3170,4590,7293,4816,2829,1608,1257,1118,-4,-20
1621,1782,3170,4590,7293,4816,2829,1608,1257,1118,6,-40
1616,1780,4302,3004,6751,6844,2860,1606,1262,1121,-6,-10
1616,1780,4302,3004,6751,6844,2860,1606,1262,1121,5,-40
1616,1780,4302,3004,6751,6844,2860,1606,1262,1121,-5,-50
1592,1025,5023,3609,7446,6739,2852,1602,1265,1125,2,-40
1592,1025,5023,3609,7446,6739,2852,1602,1265,1125,-2,-60
1592,1025,5023,3609,7446,6739,2852,1602,1265,1125,1,-70
1604,1779,5664,3645,7522,6605,2863,1595,1264,1124,-1,-80
1601,1945,6094,3701,7592,6487,2918,1590,1262,1127,5,-40
1587,3426,5418,3561,7437,6662,3140,1626,1288,1142,6,-40
1587,3426,5418,3561,7437,6662,3140,1626,1288,1142,7,-60
1581,5498,4155,3570,7458,6616,3251,1632,1296,1153,8,-40
1581,5498,4155,3570,7458,6616,3251,1632,1296,1153,4,-60
1581,5498,4155,3570,7458,6616,3251,1632,1296,1153,3,-40
1581,5498,4155,3570,7458,6616,3251,1632,1296,1153,2,-60
1581,5498,4155,3570,7458,6616,3251,1632,1296,1153,1,-40
2795,7097,2235,3606,7513,6514,3412,1634,1305,1165,2,-50
2795,7097,2235,3606,7513,6514,3412,1634,1305,1165,3,-40
4374,5780,2240,3635,7554,6458,3337,2228,1306,1169,4,-80
4374,5780,2240,3635,7554,6458,3337,2228,1306,1169,5,-40
4374,5780,2240,3635,7554,6458,3337,2228,1306,1169,6,-90
6657,2127,2247,3679,7613,6373,3085,3845,1307,1173,-7,-40
5794,1755,2249,3691,7632,6333,2983,5134,1300,1175,-6,-90
4045,1735,2208,3576,7510,6527,2932,4549,3576,1188,-5,-40
4045,1735,2208,3576,7510,6527,2932,4549,3576,1188,-4,-70
4045,1735,2208,3576,7510,6527,2932,4549,3576,1188,-3,-40
4045,1735,2208,3576,7510,6527,2932,4549,3576,1188,-2,-30
1543,1725,2194,3556,7501,6556,2787,2384,8183,1484,-1,-30
1534,1718,2183,3534,7483,6590,2713,2255,6696,3743,0,-40
1529,1709,2171,3508,7461,6636,2673,2143,5150,5561,1,-40
1525,1705,2162,3492,7446,6673,2659,2042,3743,7173,2,-50
1525,1705,2162,3492,7446,6673,2659,2042,3743,7173,3,-40
1524,1704,2160,3487,7443,6677,2660,2023,3561,7348,4,-50
1521,1701,2155,3478,7434,6702,2662,1919,1940,8149,5,-40
1521,1701,2155,3478,7434,6702,2662,1919,1940,8149,6,-50
1521,1701,2155,3478,7434,6702,2662,1919,1940,8149,7,-40
1521,1701,2155,3478,7434,6702,2662,1919,1940,8149,8,-40
1521,1701,2155,3478,7434,6702,2662,1919,1940,8149,9,-60
1508,1677,2122,3425,7400,6764,2702,1741,1585,1947,8,-40
1508,1677,2122,3425,7400,6764,2702,1741,1585,1947,7,-70
1505,1672,2119,3415,7401,6772,2709,1756,1476,1659,6,-40
1505,1672,2119,3415,7401,6772,2709,1756,1476,1659,5,-80
1505,1672,2119,3415,7401,6772,2709,1756,1476,1659,4,-40
1505,1672,2119,3415,7401,6772,2709,1756,1476,1659,3,-70
1483,1652,2093,3370,7356,6817,2789,1723,1409,1308,2,-40
1477,1648,2085,3357,7348,6837,2830,1704,1418,1267,1,-70
1476,1642,2076,3337,7323,6866,2873,1705,1416,1276,0,-40
1476,1642,2076,3337,7323,6866,2873,1705,1416,1276,-1,-50
1476,1642,2076,3337,7323,6866,2873,1705,1416,1276,-2,-40
1472,1638,2073,3336,7329,6862,2888,1730,1395,1282,-3,-90
1461,1626,2057,3314,7311,6879,2900,1759,1369,1285,-4,-40
1461,1626,2056,3314,7310,6880,2902,1764,1370,1284,-5,-90
1458,1621,2052,3304,7311,6891,2915,1792,1382,1262,-6,-40
1453,1616,2043,3292,7302,6908,2929,1821,1386,1250,-7,-90
1444,1609,2034,3275,7285,6931,2952,1846,1416,1259,-8,-40
1443,1610,2036,3278,7287,6924,2947,1847,1418,1261,-9,-70
1440,1606,2031,3277,7297,6916,2944,1846,1435,1263,-8,-40
1432,1597,2017,3248,7265,6954,2971,1858,1469,1272,-7,-80
1432,1597,2017,3248,7265,6954,2971,1858,1469,1272,-6,-40
1431,1591,2013,3247,7265,6948,2974,1863,1476,1298,-5,-50
1431,1591,2013,3247,7265,6948,2974,1863,1476,1298,-4,-40
1428,1587,2008,3240,7263,6947,2982,1870,1483,1326,-3,-60

3 Answers

If you are scaling the inputs correctly and the error still does not decrease, it may be a problem with the algorithm itself. Backpropagation sometimes gets stuck; you could try resilient propagation (Rprop) instead, which scales its updates on its own and can avoid getting stuck.
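
The core of Rprop is only a few lines: each weight keeps its own step size, which grows while its gradient keeps the same sign and shrinks when the sign flips, so no global learning rate has to be tuned. A rough sketch of the Rprop- variant (the constants are the usual defaults; the function name is just for illustration):

    import numpy as np

    def rprop_update(w, grad, prev_grad, step,
                     eta_plus=1.2, eta_minus=0.5,
                     step_min=1e-6, step_max=50.0):
        # Grow the step where the gradient kept its sign, shrink where it flipped.
        sign_change = grad * prev_grad
        step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
        step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
        grad = np.where(sign_change < 0, 0.0, grad)  # skip the update after a sign flip
        w = w - np.sign(grad) * step                 # move by the step size, sign only
        return w, grad, step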

Answered on 2013-02-22T19:43:39.003
You should try adding more hidden neurons to your second layer.

You should also explain your training algorithm, preferably with example inputs and outputs.

"For the outputs, I map them to between -1 and 1 by dividing by 100."

What exactly are you dividing by 100? And why map the outputs to the range -1 to 1 rather than 0 to 1?

Answered on 2012-11-19T20:22:49.457
I don't think more hidden neurons in layer 2 would change the result. IMHO, add another hidden layer, also with 7 neurons; a sketch of what that looks like is below.
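
To make the suggestion concrete, the forward pass from the question would gain one more weight matrix. A sketch with hypothetical names and initialization, biases omitted for brevity:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)
    W1 = rng.uniform(-0.5, 0.5, (7, 10))  # input -> hidden layer 1
    W2 = rng.uniform(-0.5, 0.5, (7, 7))   # hidden 1 -> hidden 2 (the extra layer)
    W3 = rng.uniform(-0.5, 0.5, (2, 7))   # hidden 2 -> output

    def forward(x):
        h1 = sigmoid(W1 @ x)   # first hidden layer, 7 sigmoid units
        h2 = sigmoid(W2 @ h1)  # second hidden layer, 7 sigmoid units
        y = np.tanh(W3 @ h2)   # tanh outputs as in the question
        return y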

Answered on 2012-11-19T20:24:32.780