So I just ran a Tune experiment and got the following output:

+--------------------+------------+-------+-------------+----------------+--------+------------+
| Trial name         | status     | loc   |          lr |   weight_decay |   loss |   accuracy |
|--------------------+------------+-------+-------------+----------------+--------+------------|
| trainable_13720f86 | TERMINATED |       | 0.00116961  |     0.00371219 | 0.673  |     0.7977 |
| trainable_13792744 | TERMINATED |       | 0.109529    |     0.0862344  | 0.373  |     0.8427 |
| trainable_137ecd98 | TERMINATED |       | 4.35062e-06 |     0.0261442  | 0.6993 |     0.7837 |
| trainable_1383f9d0 | TERMINATED |       | 1.37858e-05 |     0.0974182  | 0.4538 |     0.8428 |
| trainable_13892f72 | TERMINATED |       | 0.0335583   |     0.0403495  | 0.3399 |     0.8618 |
| trainable_138dd720 | TERMINATED |       | 0.00858623  |     0.0695453  | 0.3415 |     0.8612 |
| trainable_1395570c | TERMINATED |       | 4.6309e-05  |     0.0172459  | 0.39   |     0.8283 |
| trainable_139ce148 | TERMINATED |       | 2.32951e-05 |     0.0787076  | 0.3641 |     0.8512 |
| trainable_13a848ee | TERMINATED |       | 0.00431763  |     0.0341105  | 0.3415 |     0.8611 |
| trainable_13ad0a78 | TERMINATED |       | 0.0145063   |     0.050807   | 0.3668 |     0.8398 |
| trainable_13b3342a | TERMINATED |       | 5.96148e-06 |     0.0110345  | 0.3418 |     0.8608 |
| trainable_13bd4d3e | TERMINATED |       | 1.82617e-06 |     0.0655128  | 0.3667 |     0.8501 |
| trainable_13c45a2a | TERMINATED |       | 0.0459573   |     0.0224991  | 0.3432 |     0.8516 |
| trainable_13d561d0 | TERMINATED |       | 0.00060595  |     0.092522   | 0.3389 |     0.8623 |
| trainable_13dcb962 | TERMINATED |       | 0.000171044 |     0.0449039  | 0.3429 |     0.8584 |
| trainable_13e6fd32 | TERMINATED |       | 0.000104752 |     0.089106   | 0.3497 |     0.8571 |
| trainable_13ecd2ac | TERMINATED |       | 0.000793432 |     0.0477341  | 0.6007 |     0.8051 |
| trainable_13f27464 | TERMINATED |       | 0.0750381   |     0.0685323  | 0.3359 |     0.8616 |
| trainable_13f80b40 | TERMINATED |       | 1.3946e-06  |     0.0192844  | 0.5615 |     0.8146 |
| trainable_13fdf6e0 | TERMINATED |       | 9.4748e-06  |     0.0542356  | 0.3546 |     0.8493 |
+--------------------+------------+-------+-------------+----------------+--------+------------+

However, when I look at the individual results, I see the following for the third trial, trainable_137ecd98:

(Screenshot from 2020-08-24 11-05-33)

If I want to checkpoint and report the highest accuracy (or the best value of some other metric) reached by a given trial, is the user expected to track a best_metric per trial and write custom checkpointing logic that fires whenever best_metric improves?
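
For concreteness, here is a rough sketch of the kind of manual bookkeeping I mean (function-style trainable, Ray ~1.x API with tune.report; the simulated training step and the file it writes are made up for illustration):

import os
import random
from ray import tune

def trainable(config):
    # Manually remember the best accuracy this trial has seen so far.
    best_metric = float("-inf")
    for step in range(20):
        # Stand-in for a real training step; accuracy is just simulated here.
        accuracy = random.random()
        if accuracy > best_metric:
            best_metric = accuracy
            # Hand-rolled "checkpoint": write whatever state matters to disk.
            with open(os.path.join(os.getcwd(), "best_state.txt"), "w") as f:
                f.write(f"step={step} accuracy={accuracy}\n")
        tune.report(accuracy=accuracy, best_metric=best_metric)

tune.run(trainable, num_samples=2)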

I see there is a checkpoint_at_end option for tune.run, but wouldn't the most common use case be a checkpoint_if_best, since the last training iteration of a trial is rarely its best?

Thanks!


1 Answer


If you only want to keep one best checkpoint per trial, you can do:

tune.run(keep_checkpoints_num=1, checkpoint_score_attr="accuracy")
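
For these flags to have any effect, the trainable has to actually produce checkpoints. A minimal sketch of what that could look like with the class-based API (Ray ~1.x method names; the MyTrainable class and its simulated accuracy are purely illustrative):

import os
import pickle
import random
from ray import tune

class MyTrainable(tune.Trainable):
    def setup(self, config):
        self.state = {"accuracy": 0.0}

    def step(self):
        # Stand-in for a real training iteration; accuracy is simulated.
        self.state["accuracy"] = random.random()
        return {"accuracy": self.state["accuracy"]}

    def save_checkpoint(self, checkpoint_dir):
        # Tune calls this according to checkpoint_freq and keeps/deletes the
        # resulting checkpoints based on checkpoint_score_attr.
        path = os.path.join(checkpoint_dir, "state.pkl")
        with open(path, "wb") as f:
            pickle.dump(self.state, f)
        return path

    def load_checkpoint(self, path):
        with open(path, "rb") as f:
            self.state = pickle.load(f)

tune.run(
    MyTrainable,
    stop={"training_iteration": 10},
    checkpoint_freq=1,                  # checkpoint every iteration...
    keep_checkpoints_num=1,             # ...but keep only the single best one,
    checkpoint_score_attr="accuracy",   # ranked by the reported accuracy
)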

If you want to keep multiple checkpoints but retrieve the best one after the experiment finishes, you can do the following:

analysis = tune.run(...)
# Gets best trial based on max accuracy across all training iterations.
best_trial = analysis.get_best_trial(metric="accuracy", mode="max", scope="all") 
# Gets best checkpoint for trial based on accuracy.
best_checkpoint = analysis.get_best_checkpoint(best_trial, metric="accuracy")
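
Depending on the Ray version, get_best_checkpoint hands back either the checkpoint path the trainable wrote or a Checkpoint object; loading it is up to your own code. Purely as an illustration, assuming it is a path to a pickle file like in the sketch above:

import pickle

with open(best_checkpoint, "rb") as f:
    state = pickle.load(f)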
Answered 2020-08-24T18:34:03.827