Solving a TensorRT deployment problem under Flask (pycuda._driver.LogicError: explicit_context_dependent failed)
Problem description: Today, while building a Flask server for model inference, running the model with TensorRT raised the following error: "line 39, in allocate_buffers  stream = cuda.Stream()  # pycuda buffer operations ... pycuda._driver.LogicError: explicit_context_dependent failed: invalid device context - no currently active context". Searching online, the common explanation is that this error means no CUDA context is active in the thread that calls cuda.Stream(): Flask serves each request from its own worker thread, so a context created in the main thread (for example by "import pycuda.autoinit") is not current inside the request handler.
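Below is a minimal sketch of the usual workaround: create one CUDA context up front, pop it off the main thread, and explicitly push/pop it around inference inside the Flask handler so the worker thread has an active context. The GPU index 0 and the run_trt_inference() helper are assumptions standing in for the real engine-loading and buffer-allocation code, not an exact reproduction of the original service.

```python
# Sketch only: push/pop a shared pycuda context inside the Flask worker thread.
import pycuda.driver as cuda
from flask import Flask, jsonify

app = Flask(__name__)

# Initialize the driver and create one context up front. Relying on
# `import pycuda.autoinit` alone is not enough here, because the context it
# creates is bound to the import thread, not to Flask's request threads.
cuda.init()
device = cuda.Device(0)          # GPU index 0 is an assumption
ctx = device.make_context()      # context is now current in the main thread
# ... deserialize the TensorRT engine here while the context is active ...
ctx.pop()                        # release it so worker threads can push it later


def run_trt_inference():
    # Hypothetical placeholder for allocate_buffers() / execute_async() etc.
    stream = cuda.Stream()       # succeeds because a context is current
    stream.synchronize()
    return {"status": "ok"}


@app.route("/infer", methods=["POST"])
def infer():
    ctx.push()                   # make the shared context current in THIS thread
    try:
        result = run_trt_inference()
    finally:
        ctx.pop()                # always pop, even if inference raises
    return jsonify(result)


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000, threaded=True)
```

With threaded Flask, concurrent requests would push and pop the same context from different threads, so in practice you may also want a lock around the push/pop section, or one context per worker thread, depending on how the service is deployed.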