
I finally managed to train a network from a file :) Now I want to print the nodes and the weights, especially the weights, because I am training the network with pybrain and then want to implement the NN somewhere else that will use it.

I need a way to print the layers, the nodes, and the weights between nodes, so that I can easily replicate the network. So far I can see that I can access the layers with, for example, n['in'], and then I can do, for example:

>>> dir(n['in'])
['__class__', '__delattr__', '__dict__', '__doc__', '__format__', '__getattribute__', '__hash__', '__init__', '__module__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__weakref__', '_backwardImplementation', '_forwardImplementation', '_generateName', '_getName', '_growBuffers', '_name', '_nameIds', '_resetBuffers', '_setName', 'activate', 'activateOnDataset', 'argdict', 'backActivate', 'backward', 'bufferlist', 'dim', 'forward', 'getName', 'indim', 'inputbuffer', 'inputerror', 'name', 'offset', 'outdim', 'outputbuffer', 'outputerror', 'paramdim', 'reset', 'sequential', 'setArgs', 'setName', 'shift', 'whichNeuron']

But I cannot see how to access the weights from here. There is also a params attribute; for example, for my 2 4 1 network with bias it says:

>>> n.params
array([-0.8167133 ,  1.00077451, -0.7591257 , -1.1150532 , -1.58789386,
        0.11625991,  0.98547457, -0.99397871, -1.8324281 , -2.42200963,
        1.90617387,  1.93741167, -2.88433965,  0.27449852, -1.52606976,
        2.39446258,  3.01359547])

It is hard to tell what is what; in particular, I cannot tell which nodes are connected by which weights. That is all I need.
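For reference, the length of that vector is consistent with a 2 4 1 network with a bias unit: its four FullConnections contribute indim * outdim weights each, 17 in total. (The concatenation order inside n.params depends on how PyBrain sorts the connections, so it is safer to read weights per connection rather than to slice this flat vector by hand.) A quick sanity check:

```python
# Parameter count for a 2-4-1 feed-forward network with a bias unit.
# Each FullConnection holds indim * outdim weights.
in_hidden = 2 * 4    # 'in' -> 'hidden0'
hidden_out = 4 * 1   # 'hidden0' -> 'out'
bias_hidden = 1 * 4  # 'bias' -> 'hidden0'
bias_out = 1 * 1     # 'bias' -> 'out'
total = in_hidden + hidden_out + bias_hidden + bias_out
print(total)  # 17, matching len(n.params)
```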


3 Answers


There are many ways to access a network's internals, namely through its "modules" list or its "connections" dictionary. Parameters are stored within those connections or modules. For example, the following should print all of this information for an arbitrary network:

for mod in net.modules:
    print("Module:", mod.name)
    if mod.paramdim > 0:
        print("--parameters:", mod.params)
    for conn in net.connections[mod]:
        print("-connection to", conn.outmod.name)
        if conn.paramdim > 0:
            print("- parameters", conn.params)

if hasattr(net, "recurrentConns"):
    print("Recurrent connections")
    for conn in net.recurrentConns:
        print("-", conn.inmod.name, "to", conn.outmod.name)
        if conn.paramdim > 0:
            print("- parameters", conn.params)

If you want something more fine-grained (on the neuron level instead of the layer level), you will have to decompose those parameter vectors further, or, alternatively, construct your network from single-neuron layers.
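To illustrate that decomposition: a PyBrain FullConnection stores its weights as a flat vector and applies them in its forward pass as a matrix reshaped to (outdim, indim), so row i of the reshaped matrix holds the incoming weights of output neuron i. A minimal sketch with plain NumPy, where the eight values are made-up stand-ins for the conn.params of an 'in' -> 'hidden0' connection in a 2 4 1 net:

```python
import numpy as np

# Stand-in for conn.params of a FullConnection with indim=2, outdim=4.
params = np.arange(8, dtype=float)  # hypothetical flat weight vector

indim, outdim = 2, 4
# PyBrain's FullConnection computes dot(reshape(params, (outdim, indim)), input),
# so row i of W is the weight vector feeding output neuron i.
W = params.reshape(outdim, indim)

for i, row in enumerate(W):
    print("neuron", i, "incoming weights:", row)
```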

answered 2011-11-17T02:12:47.567

Try this, it worked for me:

def pesos_conexiones(n):
    for mod in n.modules:
        for conn in n.connections[mod]:
            print(conn)
            for cc in range(len(conn.params)):
                print(conn.whichBuffers(cc), conn.params[cc])

The result should be like:

<FullConnection 'co1': 'hidden1' -> 'out'>
(0, 0) -0.926912942354
(1, 0) -0.964135087592
<FullConnection 'ci1': 'in' -> 'hidden1'>
(0, 0) -1.22895643048
(1, 0) 2.97080368887
(2, 0) -0.0182867906276
(3, 0) 0.4292544603
(4, 0) 0.817440427069
(0, 1) 1.90099230604
(1, 1) 1.83477578625
(2, 1) -0.285569867513
(3, 1) 0.592193396226
(4, 1) 1.13092061631
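Once the per-connection weights are printed (or read directly via conn.params), re-implementing the trained net elsewhere only takes a few matrix products. Below is a minimal sketch in plain NumPy for a 2 4 1 net with bias, assuming buildNetwork's defaults of a sigmoid hidden layer and a linear output layer; every weight value here is a hypothetical placeholder, to be replaced with the numbers from a printout like the one above:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical weights, copied from the per-connection printout and
# reshaped to (outdim, indim) as PyBrain stores them.
W_in_hidden = np.array([[ 0.1, -0.2],
                        [ 0.3,  0.4],
                        [-0.5,  0.6],
                        [ 0.7, -0.8]])            # 'in' -> 'hidden0'
b_hidden = np.array([0.05, -0.1, 0.2, 0.0])       # 'bias' -> 'hidden0'
W_hidden_out = np.array([[0.9, -1.0, 1.1, -1.2]]) # 'hidden0' -> 'out'
b_out = np.array([0.3])                           # 'bias' -> 'out'

def activate(x):
    # Default buildNetwork: sigmoid hidden layer, linear output layer.
    hidden = sigmoid(W_in_hidden @ x + b_hidden)
    return W_hidden_out @ hidden + b_out

print(activate(np.array([1.0, 0.5])))
```

With the real values filled in, activate should reproduce net.activate([x1, x2]) for the same inputs, which is a useful check before porting the weights anywhere.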
answered 2012-10-10T21:57:31.737

Maybe this helps (PyBrain for Python 3.2)?

C:\tmp\pybrain_examples>\Python32\python.exe
Python 3.2 (r32:88445, Feb 20 2011, 21:29:02) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> from pybrain.tools.shortcuts import buildNetwork
>>> from pybrain.structure.modules.tanhlayer import TanhLayer
>>> from pybrain.structure.modules.softmax import SoftmaxLayer
>>>
>>> net = buildNetwork(4, 3, 1, bias=True, hiddenclass=TanhLayer, outclass=SoftmaxLayer)
>>> print(net)
FeedForwardNetwork-8
Modules:
[<BiasUnit 'bias'>, <LinearLayer 'in'>, <TanhLayer 'hidden0'>, <SoftmaxLayer 'out'>]
Connections:
[<FullConnection 'FullConnection-4': 'hidden0' -> 'out'>, <FullConnection 'FullConnection-5': 'bias' -> 'out'>, <FullConnection 'FullConnection-6': 'bias' -> 'hidden0'>, <FullConnection 'FullConnection-7': 'in' -> 'hidden0'>]
answered 2012-02-12T11:37:35.487