TVM Installation Tutorial and LLVM Basics: Building, Installing, and Using LLVM

TVM build:

https://blog.csdn.net/u010420283/article/details/134635586

 

LLVM build:

https://www.cnblogs.com/robotech/p/16370415.html

 

Note the version pairing:

TVM 0.13.0 with LLVM 13.0.0 (tested locally and working)

TVM download: https://tvm.apache.org/download

LLVM download: https://releases.llvm.org/download.html
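
A quick way to confirm that the build picked up the right LLVM is to query the runtime from Python. This is a minimal sanity-check sketch, assuming TVM was built with USE_LLVM enabled in config.cmake; the LLVM_VERSION key is what recent TVM builds expose in their build-info dict, so treat it as an assumption for your version:

#coding=utf-8
import tvm

print("TVM version:", tvm.__version__)

# tvm.support.libinfo() returns the build-time configuration flags;
# LLVM_VERSION should be present when the runtime was compiled against LLVM.
info = tvm.support.libinfo()
print("LLVM version:", info.get("LLVM_VERSION", "not built with LLVM"))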

Installation reference docs:

https://blog.csdn.net/gasolinesky/article/details/130091169  

and the install-from-source guide on the TVM Chinese-language site: https://tvm.hyper.ai/docs/install/from_source/

 

Installing TVM on Windows:

https://blog.csdn.net/wsp_1138886114/article/details/135123205

https://blog.csdn.net/weixin_50836014/article/details/127512029     // includes the LLVM source download link: https://codeload.github.com/llvm/llvm-project/zip/refs/tags/llvmorg-13.0.0

If the build reports syntax errors, reinstall Visual Studio.

Working test example:

https://blog.csdn.net/qq_32460819/article/details/109244214
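
Before trying a full network, a vector-add smoke test is a quick way to confirm the LLVM backend works end to end. A minimal sketch using the te API (available in TVM 0.8 through 0.13):

#coding=utf-8
import numpy as np
import tvm
from tvm import te

n = 1024
A = te.placeholder((n,), name="A")
B = te.placeholder((n,), name="B")
C = te.compute((n,), lambda i: A[i] + B[i], name="C")

s = te.create_schedule(C.op)           # the default schedule is enough here
fadd = tvm.build(s, [A, B, C], target="llvm")

dev = tvm.cpu(0)
a = tvm.nd.array(np.random.uniform(size=n).astype("float32"), dev)
b = tvm.nd.array(np.random.uniform(size=n).astype("float32"), dev)
c = tvm.nd.array(np.zeros(n, dtype="float32"), dev)
fadd(a, b, c)
np.testing.assert_allclose(c.numpy(), a.numpy() + b.numpy())
print("vector-add smoke test passed")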

Summary of common errors:

https://blog.csdn.net/JerryLiu1998/article/details/108407804

 

[Learning Deep Learning Compilers from Scratch] Part 5: Introduction to TVM Relay and Passes

https://blog.csdn.net/just_sort/article/details/116355215

Example: building a custom neural network with Relay

Using the Relay API, we define a small Conv + BN + ReLU network to show how the interface should be used. The TVM version here is 0.8.0.dev. The code is as follows:

#coding=utf-8
import tvm
from tvm import relay
import numpy as np
from tvm.contrib import graph_executor

# Construct a batch-norm layer; any parameter not supplied becomes
# a free Relay variable named after the layer.
def batch_norm(data,
               gamma=None,
               beta=None,
               moving_mean=None,
               moving_var=None,
               **kwargs):
    name = kwargs.get("name")
    kwargs.pop("name")
    if not gamma:
        gamma = relay.var(name + "_gamma")
    if not beta:
        beta = relay.var(name + "_beta")
    if not moving_mean:
        moving_mean = relay.var(name + "_moving_mean")
    if not moving_var:
        moving_var = relay.var(name + "_moving_var")
    # relay.nn.batch_norm returns (output, moving_mean, moving_var);
    # [0] keeps only the normalized output.
    return relay.nn.batch_norm(data,
                               gamma=gamma,
                               beta=beta,
                               moving_mean=moving_mean,
                               moving_var=moving_var,
                               **kwargs)[0]

# Construct a conv2d layer; the weight becomes a free Relay variable if not supplied
def conv2d(data, weight=None, **kwargs):
    name = kwargs.get("name")
    kwargs.pop("name")
    if not weight:
        weight = relay.var(name + "_weight")
    return relay.nn.conv2d(data, weight, **kwargs)


# Build the Conv + BN + ReLU "simplenet" block
def simplenet(data, name, channels, kernel_size=(3, 3), strides=(1, 1),
              padding=(1, 1), epsilon=1e-5):
    conv = conv2d(
        data=data,
        channels=channels,
        kernel_size=kernel_size,
        strides=strides,
        padding=padding,
        data_layout='NCHW',
        name=name+'_conv')
    bn = batch_norm(data=conv, epsilon=epsilon, name=name + '_bn')
    act = relay.nn.relu(data=bn)
    return act

data_shape = (1, 3, 224, 224)
kernel_shape = (32, 3, 3, 3)
dtype = "float32"
data = relay.var("data", shape=data_shape, dtype=dtype)
act = simplenet(data, "graph", 32, strides=(2, 2))
# free_vars collects the input plus every auto-created weight/BN variable
func = relay.Function(relay.analysis.free_vars(act), act)

print(func)

np_data = np.random.uniform(-1, 1, (1, 3, 224, 224))

# Random parameters matching the shapes Relay inferred for the free variables
params = {
    "graph_conv_weight": tvm.nd.array(np.random.uniform(-1, 1, (32, 3, 3, 3)).astype(dtype)),
    "graph_bn_gamma": tvm.nd.array(np.random.uniform(-1, 1, (32)).astype(dtype)),
    "graph_bn_beta": tvm.nd.array(np.random.uniform(-1, 1, (32)).astype(dtype)),
    "graph_bn_moving_mean": tvm.nd.array(np.random.uniform(-1, 1, (32)).astype(dtype)),
    "graph_bn_moving_var": tvm.nd.array(np.random.uniform(-1, 1, (32)).astype(dtype)),
}

# Compile with the LLVM CPU backend at the highest optimization level
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(func, "llvm", params=params)

dev = tvm.cpu(0)
dtype = "float32"
m = graph_executor.GraphModule(lib["default"](dev))
# set inputs
m.set_input("data", tvm.nd.array(np_data.astype(dtype)))
# execute
m.run()
# get outputs
tvm_output = m.get_output(0)
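
Since the article above introduces Relay passes, it helps to see one applied by hand instead of hidden inside relay.build. Continuing the same script, a minimal sketch using the standard relay.transform APIs: SimplifyInference is the pass that decomposes nn.batch_norm into plain elementwise arithmetic for inference.

# Apply a single Relay pass by hand and inspect the result.
mod = tvm.IRModule.from_expr(func)
mod = relay.transform.InferType()(mod)         # type info is required first
mod = relay.transform.SimplifyInference()(mod)
print(mod)  # nn.batch_norm is gone, replaced by elementwise multiply/add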

  

 
