
I wanted a faster random forest classifier than Weka's, so I just tried Shark (I can't use commercial classifiers like wiseRF). I know there is an alternative RF classifier for Weka, but I was hoping for better results from this well-known fast C++ library. My first results are:

Training time: Weka > 170 s vs Shark > 168 s
Prediction results on the same test set: Weka > 90.1% correctly classified vs Shark > accuracy of 0.510824 (!!!!)

This sounds crazy, so I believe there must be plenty of room to improve these results.

I got Shark using the recommended commands:

svn co https://svn.code.sf.net/p/shark-project/code/trunk/Shark
sudo apt-get install cmake cmake-curses-gui libatlas-base-dev libboost-all-dev

I successfully ran cmake with the suggested options:

cmake -DOPT_ENABLE_ATLAS=ON -DOPT_ENABLE_OPENMP=ON

I used the basic example "RFTutorial.cpp" from the documentation and added "trainer.setNTrees(100);" to get the same number of trees as in my Weka test.

I used the suggested basic Makefile and added:

SHARK_ROOT = /home/doxav/Shark
CPPFLAGS   =  -I${BOOST_ROOT} -I${SHARK_ROOT}/include -Wall
CPPFLAGS  += -DNDEBUG -DBOOST_UBLAS_NDEBUG  -O3
LDFLAGS   += -L${SHARK_ROOT}/lib/ -L${BOOST_ROOT}/lib
LDLIBS     = -lshark -lgomp
LDLIBS    += -lboost_serialization -lboost_system -lboost_filesystem -lboost_program_options
CC         =  g++

When running my makefile, I get the following g++ command line:

g++  -I -I/home/xavier/Shark/include -Wall -DNDEBUG -DBOOST_UBLAS_NDEBUG  -O3 -L/home/xavier/Shark/lib/ -L/lib  RFTest.cpp  -lshark -lgomp -lboost_serialization -lboost_system -lboost_filesystem -lboost_program_options -o RFTest

I had to adapt the CSV file I used with Weka to make it work with Shark, since Shark apparently does not accept strings:

Weka =>
225,#225,138.6,-648,225,0.410451,#2,0,0,0.0256,0.0256,0.15411,?,?,0.045524,0.006503,0.002223,0.782222,1.328889,?,1.017778,0.617778,0,-11,?,-6,-5,176,116,-1430,0,0,0.170455,0.170455,0.136174,?,?,0.041649,0.00595,0.001192,299,269,-659,0,0,0.006689,-0.143509,0.23395,?,?,0.015899,-0.005781,0.002956,?,?,?,?,?,?,?,?,?,?,?,?,?,229,139,-653,0,0,0.026201,0.026201,0.093029,?,?,0.047562,0.006795,0.000937,139,79,-13945,0,0,0,0,-0.094604,?,?,?,?,0.001049,#225

Shark (I removed "?", which marks unknown values, and "#", which I use to force Weka to interpret some numeric values as nominal) =>
225,225,138.6,-648,225,0.410451,2,0,0,0.0256,0.0256,0.15411,,,0.045524,0.006503,0.002223,0.782222,1.328889,,1.017778,0.617778,0,-11,,-6,-5,176,116,-1430,0,0,0.170455,0.170455,0.136174,,,0.041649,0.00595,0.001192,299,269,-659,0,0,0.006689,-0.143509,0.23395,,,0.015899,-0.005781,0.002956,,,,,,,,,,,,,,229,139,-653,0,0,0.026201,0.026201,0.093029,,,0.047562,0.006795,0.000937,139,79,-13945,0,0,0,0,-0.094604,,,,,0.001049,225