Commit
Update onnxruntime/core/providers/tensorrt/tensorrt_execution_provider.cc

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
jingyanwangms and github-actions[bot] authored Jan 31, 2025
1 parent afe0a26 commit 169e13d
Showing 1 changed file with 1 addition and 1 deletion.
@@ -2497,7 +2497,7 @@ TensorrtExecutionProvider::GetCapability(const GraphViewer& graph,
const auto& node = graph.GetNode(node_index[i]);
const bool is_context_node = node && !node->OpType().empty() && node->OpType() == "EPContext";
if (is_context_node) {
-      SubGraph_t supported_node_vector (std::vector<long unsigned int>{i}, true);
+      SubGraph_t supported_node_vector(std::vector<long unsigned int>{i}, true);

Check warning on line 2500 in onnxruntime/core/providers/tensorrt/tensorrt_execution_provider.cc

GitHub Actions / Optional Lint C++

[cpplint] reported by reviewdog 🐶 Use int16_t/int64_t/etc, rather than the C type long [runtime/int] [4]
std::unique_ptr<IndexedSubGraph> sub_graph = GetSubGraph(supported_node_vector, graph, model_hash, subgraph_idx++);
result.push_back(ComputeCapability::Create(std::move(sub_graph)));
}
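For context, the cpplint [runtime/int] warning above asks for a fixed-width or size type in place of the spelled-out C type long unsigned int (the committed change only fixes the stray space before the parenthesis). Below is a minimal, self-contained sketch of the form the linter prefers. It assumes SubGraph_t pairs a vector of node indices with a bool flag; that definition is an illustration for this sketch, not a quote of the actual onnxruntime type.

#include <cstddef>
#include <utility>
#include <vector>

// Stand-in for onnxruntime's SubGraph_t: a list of node indices plus a flag
// for whether the subgraph is fully supported. Assumption for this sketch
// only; the real type is declared in the TensorRT EP headers.
using SubGraph_t = std::pair<std::vector<size_t>, bool>;

int main() {
  size_t i = 0;  // hypothetical node index
  // Spelling the element type as size_t (or uint64_t from <cstdint>) avoids
  // the cpplint [runtime/int] complaint about the C type "long".
  SubGraph_t supported_node_vector(std::vector<size_t>{i}, true);
  return static_cast<int>(supported_node_vector.first.size());
}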
