📚 The doc issue
We do not specify the IR for placeholders/graph inputs that delegates need to support.
For example, today I think placeholders will usually be tensors, with float and int inputs being lifted to rank-0 tensors. But raw SymInts also show up as placeholders in the presence of graph breaks. For example:
Lowered module 0:
```python
def forward(...):
    sym_size: "Sym(s53)" = torch.ops.aten.sym_size.int(attention_mask, 1);  attention_mask = None
    return [sym_size, aten_expand_copy_default, aten_add_tensor_1]
```
Lowered module 1:
```python
def forward(aten_embedding_default_2: "f32[1, s53, 768]", sym_size: "Sym(s53)"):
    aten_view_copy_default_3: "f32[1, s53, 768]" = executorch_exir_dialects_edge__ops_aten_view_copy_default(aten_permute_copy_default_3, [1, sym_size, 768]);
```
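For context, here is a minimal sketch of how a SymInt ends up in an exported graph. The toy module, tensor shapes, and dimension name below are hypothetical (not taken from the model above); it only shows the `sym_size` node appearing in the graph. The lowered-module dumps above additionally involve partitioning/delegation (e.g. `to_backend`), which is what turns that SymInt into a raw placeholder input of a later partition.

```python
import torch
from torch.export import Dim, export

# Hypothetical toy module: reading a dynamic dimension produces a SymInt in the traced graph.
class Toy(torch.nn.Module):
    def forward(self, attention_mask):
        s = attention_mask.shape[1]  # traced as torch.ops.aten.sym_size.int(attention_mask, 1)
        return attention_mask.float().expand(2, s)

mask = torch.ones(1, 8, dtype=torch.bool)
ep = export(Toy(), (mask,), dynamic_shapes={"attention_mask": {1: Dim("s")}})
print(ep.graph_module.code)  # shows the sym_size node feeding downstream ops
```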
We should clarify the placeholder IR that delegates are expected to support.
Suggest a potential alternative/fix
No response