Notes on TensorBoard
About TensorBoard:
It is used after the model has been trained, in the stage where we analyze the results. Its purpose is to visualize CNN metrics so that we can see more clearly what happened during training. Specifically (a minimal sketch of a few of these calls follows the list below):
- Tracking and visualizing metrics such as loss and accuracy
- Visualizing the model graph (ops and layers)
- Viewing histograms of weights, biases, or other tensors as they change over time
- Projecting embeddings to a lower dimensional space
- Displaying images, text, and audio data
- Profiling TensorFlow programs
- And much more
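As a quick illustration of a few items on the list (a minimal sketch, not part of the original note; the random tensors and the tag names are only placeholders), the corresponding SummaryWriter calls look roughly like this:
import torch
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter()
imgs = torch.rand(16, 3, 28, 28)                   # a fake batch of images
writer.add_images('samples', imgs, global_step=0)  # display image data
writer.add_text('notes', 'first experiment', 0)    # display text data
writer.add_embedding(torch.rand(100, 64),          # project embeddings to a lower dimensional space
                     metadata=list(range(100)),
                     global_step=0)
writer.close()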
Usage:
pip install tensorboard (the version must be 1.15 or above)
PyTorch added the TensorBoard utility package in version 1.1.0:
from torch.utils.tensorboard import SummaryWriter (SummaryWriter is a class)
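To confirm the versions mentioned above, a quick check is (just a sketch; it assumes both packages are already installed):
import torch
import tensorboard

print(torch.__version__)        # torch.utils.tensorboard requires PyTorch >= 1.1.0
print(tensorboard.__version__)  # the tensorboard package should be >= 1.15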
To use TensorBoard, our task is to save the data we want displayed to a file that TensorBoard can read.
Create a SummaryWriter instance and a PyTorch network instance, and pull one batch of images and labels from the loader. Then the image grid and the network graph are added to TensorBoard:
tb = SummaryWriter()
network = Network()
images, labels = next(iter(train_loader))

grid = torchvision.utils.make_grid(images)
tb.add_image('images', grid)
tb.add_graph(network, images)
tb.close()
Running TensorBoard: (running the .py file creates a runs folder in the directory containing the .py file)
By default the SummaryWriter instance writes its data to the ./runs directory on disk; when launching TensorBoard, tell it where the data lives by passing the directory as an argument:
tensorboard --logdir=runs
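If you do not want the default ./runs location, SummaryWriter also accepts a log_dir argument; just keep it consistent with the --logdir you pass on the command line (a small sketch; the directory name logs/my_experiment is only an example):
from torch.utils.tensorboard import SummaryWriter

tb = SummaryWriter(log_dir='logs/my_experiment')  # write events here instead of ./runs/<datetime>
tb.add_scalar('demo', 1.0, 0)
tb.close()
# then launch with:  tensorboard --logdir=logs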
Open the TensorBoard interface:
http://localhost:6006
(At this point you can see the network graph and the image data.)
In the terminal, cd to the directory two levels above the events.out file, then run tensorboard --logdir=<the directory one level above the events.out file>.
Then open the access URL in Chrome:
http://localhost:6006/
TensorBoard Histograms And Scalars:
The next type of data we can add to TensorBoard is numeric data. We can add scalar values that will be plotted over time or over epochs. We can also add values to histograms to see the frequency distribution of the values.
Example:
tb.add_scalar('Loss', total_loss, epoch)
tb.add_scalar('Number Correct', total_correct, epoch)
tb.add_scalar('Accuracy', total_correct / len(train_set), epoch)
tb.add_histogram('conv1.bias', network.conv1.bias, epoch)
tb.add_histogram('conv1.weight', network.conv1.weight, epoch)
tb.add_histogram('conv1.weight.grad', network.conv1.weight.grad, epoch)
Place the above calls inside the training loop:

import torch
import torchvision
import torch.optim as optim
import torch.nn.functional as F
from torch.utils.tensorboard import SummaryWriter

network = Network()
train_loader = torch.utils.data.DataLoader(train_set, batch_size=100)
optimizer = optim.Adam(network.parameters(), lr=0.01)

images, labels = next(iter(train_loader))
grid = torchvision.utils.make_grid(images)

tb = SummaryWriter()
tb.add_image('images', grid)
tb.add_graph(network, images)

for epoch in range(1):
    total_loss = 0
    total_correct = 0

    for batch in train_loader:  # Get Batch
        images, labels = batch

        preds = network(images)  # Pass Batch
        loss = F.cross_entropy(preds, labels)  # Calculate Loss

        optimizer.zero_grad()
        loss.backward()  # Calculate Gradient
        optimizer.step()  # Update Weights

        total_loss += loss.item()
        total_correct += preds.argmax(dim=1).eq(labels).sum().item()

    tb.add_scalar('Loss', total_loss, epoch)
    tb.add_scalar('Number Correct', total_correct, epoch)
    tb.add_scalar('Accuracy', total_correct / len(train_set), epoch)

    tb.add_histogram('conv1.bias', network.conv1.bias, epoch)
    tb.add_histogram('conv1.weight', network.conv1.weight, epoch)
    tb.add_histogram('conv1.weight.grad', network.conv1.weight.grad, epoch)

    print(
        "epoch", epoch,
        "total_correct:", total_correct,
        "loss:", total_loss
    )

tb.close()
These values added to TensorBoard are also updated in real time as the network trains.
The real power of TensorBoard lies in its ability to compare multiple runs. This lets us experiment quickly by changing hyperparameter values and comparing the runs to see which parameters work best.
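One common way to set this up (a sketch under the assumption that Network, train_set and the training loop above stay the same) is to create a fresh SummaryWriter for each hyperparameter combination and encode the values in its comment, so every run gets its own folder under ./runs and the curves can be compared side by side:
from itertools import product
from torch.utils.tensorboard import SummaryWriter

parameters = dict(lr=[0.01, 0.001], batch_size=[100, 1000])

for lr, batch_size in product(*parameters.values()):
    comment = f'-lr={lr}-batch_size={batch_size}'
    tb = SummaryWriter(comment=comment)  # one run folder per combination under ./runs

    # ... build the network, DataLoader(train_set, batch_size=batch_size),
    # the optimizer with lr=lr, and run the training loop shown above,
    # logging scalars/histograms through this tb instance ...

    tb.close()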
Another example (from the reference https://blog.csdn.net/wuzhihuaw/article/details/121357355):
import numpy as np
from torch.utils.tensorboard import SummaryWriter
writer = SummaryWriter(comment='tensorboard_test')
for x in range(50):
    writer.add_scalar('y=2x', x * 2, x)
    writer.add_scalar('y=pow(2, x)', 2 ** x, x)
    writer.add_scalars('data/scalar_group', {"xsinx": x * np.sin(x),
                                             "xcosx": x * np.cos(x),
                                             "arctanx": np.arctan(x)}, x)
writer.close()
