Questions tagged [fann]
python - Lib fann2 fails to install
I want to install a library (the Fast Artificial Neural Network library, FANN) for use in Python (version 3.7). However, an exception is stopping the installation process. Below are my command and the exception:
Command: pip install fann2 (as shown on pypi.org)
Exception:
    Collecting fann2
      Using cached https://files.pythonhosted.org/packages/80/a1/fed455d25c34a62d4625254880f052502a49461a5dd1b80854387ae2b25f/fann2-1.1.2.tar.gz
      Complete output from command python setup.py egg_info:
      Looking for FANN libs...
      Traceback (most recent call last):
        File "", line 1, in
        line 92, in
          build_swig()
        line 85, in build_swig
          find_fann()
        line 66, in find_fann
          raise Exception("Couldn't find FANN source libs!")
      Exception: Couldn't find FANN source libs!
    Command "python setup.py egg_info" failed with error code 1 in C:\mypath\pip-install-ilv2i8yp\fann2\
My operating system is Windows 10. Can anyone help me? Thanks in advance.
php - Is it possible to make an RNN with FANN
I'm playing around with PHP-CLI and MQL4. I want to build a neural network for forex prediction.
Is it possible to build a recurrent neural network (RNN) with gated recurrent units (GRU) or long short-term memory (LSTM) using the FANN library?
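For context, FANN's creation functions build layered feedforward networks (optionally with shortcut connections); gated recurrent cells such as GRU or LSTM are not part of that API. A minimal sketch of the C API that php-fann wraps, with layer sizes chosen purely for illustration:

    #include <fann.h>   /* standard float build; doublefann.h/fixedfann.h only change the data type */

    int main(void) {
        /* fann_create_standard() builds a fully connected layered feedforward net:
           here 3 layers with 3 input, 8 hidden and 1 output neurons (sizes are illustrative). */
        struct fann *ann = fann_create_standard(3, 3, 8, 1);

        /* fann_create_shortcut() additionally connects each layer to all later layers,
           but neither variant has feedback connections, so there is no GRU/LSTM cell. */
        fann_set_activation_function_hidden(ann, FANN_SIGMOID_SYMMETRIC);
        fann_set_activation_function_output(ann, FANN_SIGMOID_SYMMETRIC);

        fann_destroy(ann);
        return 0;
    }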
c++ - fann_set_bit_fail_limit() fails to actually set the bit fail limit, forcing me to edit the ANN file by hand
Whenever I load this neural network from a file and read fann_get_bit_fail_limit(), the value returned is the default 0.35 instead of the 0.002 I tried to set. I have to edit the ANN file by hand for the limit to take effect.
I'm using doublefann.h. Am I doing something wrong?
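The code from the question is not shown, but one thing worth checking is whether the limit is set on the network object that actually gets saved, or re-set after loading. A minimal sketch along those lines (the file name is made up):

    #include <doublefann.h>   /* as in the question; fann_type is double here */
    #include <cstdio>

    int main() {
        struct fann *ann = fann_create_from_file("ann.net");  /* hypothetical file name */

        /* Set the limit on the loaded object, then write it back so the value ends up
           in the .net file instead of the library default of 0.35. */
        fann_set_bit_fail_limit(ann, 0.002);
        std::printf("bit fail limit: %f\n", (double)fann_get_bit_fail_limit(ann));

        fann_save(ann, "ann.net");
        fann_destroy(ann);
        return 0;
    }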
c++ - FANN: memory leak when training an ANN on data read from multiple files
I have the following loop:
ann is the network. batchFiles is a std::vector<std::filesystem::path>.
This code iterates over all the training-data files in a folder and trains the ANN on each one, repeating the whole pass as many times as the epochs variable specifies.
The following line causes a memory leak:
struct fann_train_data *data = fann_read_train_from_file(it->string().c_str());
The problem is that I have to keep switching between training files because I don't have enough memory to load them all at once; otherwise I would simply load the training data once.
Why does this happen, and how can I fix it?
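The loop body itself is not reproduced above, but in FANN the data returned by fann_read_train_from_file() has to be released explicitly. A sketch of such a loop with an explicit fann_destroy_train() call (variable names follow the question; the training call and epoch handling are assumptions):

    #include <fann.h>        /* or doublefann.h, depending on the build */
    #include <filesystem>
    #include <vector>

    void trainFromBatches(struct fann *ann,
                          const std::vector<std::filesystem::path> &batchFiles,
                          int epochs) {
        for (int e = 0; e < epochs; ++e) {
            for (auto it = batchFiles.begin(); it != batchFiles.end(); ++it) {
                struct fann_train_data *data =
                    fann_read_train_from_file(it->string().c_str());

                /* Hypothetical training call; the question's actual call is not shown. */
                fann_train_epoch(ann, data);

                /* fann_read_train_from_file() allocates the whole set on the heap;
                   without this call every iteration leaks it. */
                fann_destroy_train(data);
            }
        }
    }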
c++ - Training NN to calculate atan2(y, x)
I've been working on a Q reinforcement learning implementation, where Q(π, a) is approximated with a neural network. During troubleshooting, I reduced the problem down to a very simple first step: train a NN to calculate atan2(y, x).
I'm using FANN for this problem, but the library is largely irrelevant as this question is more about the appropriate technique to use.
I have been struggling to teach the NN, given input = {x, y}, to calculate output = atan2(y, x).
Here is the naïve approach I have been using. It's extremely simplistic, but I'm trying to keep it simple and work up from there.
Super simple, right? However, even after 100,000,000 iterations this neural network fails:
Testing atan2(0.949040, 0.756997) = 0.897493 -> 0.987712
I also tried using a linear activation function on the output layer (FANN_LINEAR). No luck. In fact, the results are much worse. After 100,000,000 iterations, we get:
Testing atan2(0.949040, 0.756997) = 0.897493 -> 7.648625
Which is even worse than when the weights were randomly initialized. How could a NN get worse after training?
I found this issue with FANN_LINEAR to be consistent with other tests. When linear output is needed (e.g. in the calculation of the Q value, which corresponds to arbitrarily large or small rewards), this approach fails miserably and error actually appears to increase with training.
So what is going on? Is using a fully-connected 2-3-1 NN inappropriate for this situation? Is a symmetric sigmoid activation function in the hidden layer inappropriate? I fail to see what else could possibly account for this error.
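The code from the question is not reproduced above; purely as a sketch of the kind of small 2-N-1 experiment described, here is one way it is often wired up with FANN. Everything below (sample count, 16 hidden neurons, epoch counts) is an illustrative assumption, not the original code. One substantive detail worth noting: atan2 returns values in (-π, π], which does not fit inside the (-1, 1) output range of FANN_SIGMOID_SYMMETRIC unless the target is rescaled.

    #include <fann.h>
    #include <cmath>
    #include <cstdio>
    #include <cstdlib>

    int main() {
        const double kPi = 3.14159265358979323846;
        const unsigned int num_data = 10000, num_input = 2, num_output = 1;

        /* Build a random training set: inputs {x, y}, target atan2(y, x) divided by pi
           so it stays inside the symmetric sigmoid's (-1, 1) range. */
        struct fann_train_data *data = fann_create_train(num_data, num_input, num_output);
        for (unsigned int i = 0; i < num_data; ++i) {
            double x = 2.0 * std::rand() / RAND_MAX - 1.0;
            double y = 2.0 * std::rand() / RAND_MAX - 1.0;
            data->input[i][0] = (fann_type)x;
            data->input[i][1] = (fann_type)y;
            data->output[i][0] = (fann_type)(std::atan2(y, x) / kPi);
        }

        /* Fully connected 2-16-1 network, symmetric sigmoid in the hidden layer. */
        struct fann *ann = fann_create_standard(3, num_input, 16u, num_output);
        fann_set_activation_function_hidden(ann, FANN_SIGMOID_SYMMETRIC);
        fann_set_activation_function_output(ann, FANN_SIGMOID_SYMMETRIC);

        fann_train_on_data(ann, data, 5000, 500, 0.0001f);

        /* The test point from the question: atan2(0.949040, 0.756997) ~= 0.897493. */
        fann_type in[2] = { (fann_type)0.756997, (fann_type)0.949040 };
        fann_type *out = fann_run(ann, in);
        std::printf("predicted %f, expected %f\n", out[0] * kPi, std::atan2(in[1], in[0]));

        fann_destroy_train(data);
        fann_destroy(ann);
        return 0;
    }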
laravel - Installing PHP-FANN on Homestead, fann.so not found
I installed PHP-FANN on laravel/homestead version '8.1.0' with the following commands:
Afterwards fann.so should have been added to the PHP extension directory, but I can't find it. Can anyone help me?
php - I created and trained a PHP-FANN, but I'm not getting the desired results or accuracy
I created a FANN in PHP with the help of some examples and tutorials by geekgirljoy, based on the OCR example in the php-fann-repo.
I'm trying to build a system that tells me, based on an order number, which type of order it is.
I've created the training data, trained the network and tested it, but I can't get the results I expect. I'm now at the point where randomly changing parameters no longer helps, and I'm not sure my assumptions were right in the first place.
Some of the training data: I've got 60k lines of space-separated binary order numbers (a note on the file layout FANN expects follows the scripts below).
Training file:
My training script:
My test script:
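As a general note on the training data mentioned above (the files and scripts themselves are not reproduced here): the file format read by FANN's training loaders, including php-fann's fann_read_train_from_file(), is a header line giving the number of patterns, inputs and outputs, followed by alternating input and output lines. The numbers below are purely illustrative, not the question's data:

    60000 20 4
    0 1 0 1 1 0 1 0 0 1 1 0 0 1 0 1 1 0 0 1
    1 0 0 0
    1 0 0 1 0 1 1 0 1 0 0 1 1 0 1 0 0 1 1 0
    0 1 0 0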