Re-evaluate Generics Usage in Tensor Design
Currently, generics in the tensor implementation let callers specify the element type directly, with no need to pass a dtype argument, which makes the data type explicit at the call site. However, this design has a drawback: once precision is fixed through a generic parameter, it cannot easily be changed at runtime. This complicates tensor precision conversion: converting a tensor's precision is awkward and can break the computation graph when downstream nodes expect the original type.
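To make the problem concrete, here is a minimal Rust sketch of the issue (the `Tensor<T>` and `relu_f32` names are illustrative, not from the actual codebase): once the element type is a generic parameter, a precision conversion produces a value of a *different* static type, which existing graph nodes no longer accept.

```rust
// Hypothetical generic tensor: the element type T is fixed at compile time.
#[derive(Debug, PartialEq)]
struct Tensor<T> {
    data: Vec<T>,
}

impl Tensor<f32> {
    // Converting precision produces a *different* static type.
    fn to_f64(&self) -> Tensor<f64> {
        Tensor { data: self.data.iter().map(|&x| x as f64).collect() }
    }
}

// A graph node typed against Tensor<f32> cannot accept the converted tensor.
fn relu_f32(t: &Tensor<f32>) -> Tensor<f32> {
    Tensor { data: t.data.iter().map(|&x| x.max(0.0)).collect() }
}

fn main() {
    let t = Tensor::<f32> { data: vec![-1.0, 2.0] };
    let _ok = relu_f32(&t);
    let t64 = t.to_f64();
    // relu_f32(&t64); // does not compile: expected Tensor<f32>, found Tensor<f64>
    println!("{:?}", t64.data);
}
```

The conversion itself is easy to write; the friction is that every node in the graph is monomorphized against one concrete type, so a cast changes the type of everything downstream.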
Request
- Assess whether generics are the optimal solution for tensor typing in the current framework
- Explore alternative designs that allow runtime precision conversion without breaking graph semantics or unnecessarily constraining flexibility
- Document issues with current generics-based design, especially regarding precision compatibility and conversion
- Propose improvements or alternatives based on your evaluation
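One candidate alternative worth evaluating: carry the dtype at runtime (e.g. in an enum) so a precision cast returns the same static type and graph nodes written once keep working. A hedged sketch, with illustrative names (`DynTensor`, `Storage`, `cast_f64`, `relu`) that are not from the source:

```rust
// Hypothetical runtime-dtype design: precision lives in an enum, so casting
// does not change the static type of the tensor.
#[derive(Clone, Debug, PartialEq)]
enum Storage {
    F32(Vec<f32>),
    F64(Vec<f64>),
}

#[derive(Clone, Debug, PartialEq)]
struct DynTensor {
    storage: Storage,
}

impl DynTensor {
    // Cast to f64: same static type before and after the conversion.
    fn cast_f64(&self) -> DynTensor {
        let storage = match &self.storage {
            Storage::F32(v) => Storage::F64(v.iter().map(|&x| x as f64).collect()),
            s @ Storage::F64(_) => s.clone(),
        };
        DynTensor { storage }
    }
}

// A graph node written once against DynTensor accepts either precision.
fn relu(t: &DynTensor) -> DynTensor {
    let storage = match &t.storage {
        Storage::F32(v) => Storage::F32(v.iter().map(|&x| x.max(0.0)).collect()),
        Storage::F64(v) => Storage::F64(v.iter().map(|&x| x.max(0.0)).collect()),
    };
    DynTensor { storage }
}

fn main() {
    let t = DynTensor { storage: Storage::F32(vec![-1.0, 2.0]) };
    let out = relu(&t.cast_f64()); // conversion does not break the graph
    println!("{:?}", out);
}
```

The trade-off to document: this recovers runtime flexibility and keeps graph semantics intact across casts, but gives up the compile-time type guarantees of generics and adds per-op dtype dispatch, which is exactly the performance/clarity tension the evaluation should weigh.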
Note: The evaluation should weigh the pros and cons of generics, including performance impact, conversion complexity, API clarity, and overall maintainability.