Conversation
@@ -258,12 +263,13 @@ def aten_addmv(

@torch_op("aten::addr", traceable=True)

Review comment: traceonly etc.
❌ 59 Tests Failed:
@@ -0,0 +1,68 @@
"""Type promotion functions for op implementations."""

from typing import Sequence

Code scanning / lintrunner warnings: RUFF/CPY001, RUFF/format, RUFF-FORMAT/format (on the docstring line); RUFF/I001 (on the import).
Should we merge a basic version before moving the whole thing to PyTorch, so that we don't need to start from scratch?

Sure! We can do that.
raise ValueError(f"Unexpected data types: {a}, {b}")

def promote_types(op, values: Sequence[ir.Value]) -> Sequence[ir.Value]:

Review comment: If it is used only for binary operators, I would add it to the function name just to make sure it is not called in any other operator.
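To illustrate the reviewer's renaming suggestion, here is a minimal, self-contained sketch of a binary-only promotion helper. The name `promote_types_binary` and the dtype-string lattice are assumptions for illustration only; the PR's actual helper operates on `ir.Value` objects and follows the full PyTorch promotion table, and only a handful of dtype pairs are covered here.

```python
# Hypothetical sketch (not the PR's actual API): a promotion helper renamed
# per the review suggestion so it cannot be mistaken for an n-ary utility.
# The lattice below is a small illustrative subset of the promotion table.

_PROMOTION_TABLE = {
    ("int32", "int32"): "int32",
    ("int32", "int64"): "int64",
    ("int64", "int32"): "int64",
    ("int64", "int64"): "int64",
    ("int32", "float32"): "float32",
    ("float32", "int32"): "float32",
    ("float32", "float32"): "float32",
    ("float32", "float64"): "float64",
    ("float64", "float32"): "float64",
    ("float64", "float64"): "float64",
}


def promote_types_binary(a: str, b: str) -> str:
    """Return the promoted dtype for exactly two operands of a binary op."""
    try:
        return _PROMOTION_TABLE[(a, b)]
    except KeyError:
        # Mirrors the error raised in the PR for unsupported dtype pairs.
        raise ValueError(f"Unexpected data types: {a}, {b}") from None
```

Restricting the signature to exactly two operands (rather than a `Sequence`) also makes misuse in non-binary operators a type error rather than a silent bug.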
Implement PyTorch type promotion for torchlib functions. Reference https://github.com/pytorch/pytorch/blob/bdd942efd76e74baa5dd0a262f7c843ddfe2e11b/torch/_prims_common/__init__.py#L1160
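One subtlety the referenced PyTorch code handles is that Python scalars promote "weakly": they can bump the result's category (bool < integer < floating point) but never its precision within a category. A minimal sketch of that rule, under assumptions: the function name, the dtype strings, and the choice of float32 as the default floating-point dtype are illustrative, not the PR's actual implementation.

```python
# Illustrative sketch of weak scalar promotion, in the spirit of
# torch._prims_common. All names here are hypothetical.

# Category per dtype: 0 = bool, 1 = integer, 2 = floating point.
_CATEGORY = {"bool": 0, "int32": 1, "int64": 1, "float32": 2, "float64": 2}

# Default dtype used when a scalar bumps the result into a higher category
# (assumes float32 as the default floating-point dtype, as in PyTorch).
_DEFAULT = {1: "int64", 2: "float32"}


def promote_with_scalar(tensor_dtype: str, scalar) -> str:
    """Result dtype for `tensor <op> python_scalar` under weak promotion."""
    # Check bool first: in Python, isinstance(True, int) is also True.
    if isinstance(scalar, bool):
        scalar_cat = 0
    elif isinstance(scalar, float):
        scalar_cat = 2
    else:
        scalar_cat = 1
    if scalar_cat <= _CATEGORY[tensor_dtype]:
        # Scalar is in the same or a lower category: tensor dtype wins,
        # regardless of the scalar's value or precision.
        return tensor_dtype
    # Category bump: use the default dtype of the scalar's category.
    return _DEFAULT[scalar_cat]
```

For example, an int64 tensor plus `1.0` promotes to the default float dtype rather than float64, while a float32 tensor plus `1` stays float32.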