Update mlir-aie from v1.1.3 to v1.1.4 #57
Conversation
📊 Test Results for Test Example Applications, fc8fd1c (2025_12_18_21_35_57), IRONCLAD
📈 Trends (vs main branch) for Test Example Applications, fc8fd1c (2025_12_18_21_35_57), IRONCLAD Trends
llama_3.2_1b
llama_3.2_1b_prompt_13_tokens_1
llama_3.2_1b_prompt_13_tokens_40
llama_3.2_1b_prompt_2048_tokens_1
llama_3.2_1b_prompt_2048_tokens_40
📊 Test Results for Small Benchmark/Test Suite, fc8fd1c (2025_12_18_21_45_13), IRONCLAD
📈 Trends (vs main branch) for Small Benchmark/Test Suite, fc8fd1c (2025_12_18_21_45_13), IRONCLAD Trends
axpy_1_cols_2_channels_2048_tile_2048_3.0
axpy_1_cols_2_channels_2048_tile_2048_3.0_0
axpy_2_cols_2_channels_2048_tile_1024_3.0
axpy_2_cols_2_channels_2048_tile_1024_3.0_0
axpy_4_cols_2_channels_2048_tile_512_3.0
axpy_4_cols_2_channels_2048_tile_512_3.0_0
axpy_8_cols_2_channels_2048_tile_256_3.0
axpy_8_cols_2_channels_2048_tile_256_3.0_0
dequant_1_cols_1_channels_2048_tile_2048
dequant_1_cols_1_channels_2048_tile_2048_0
dequant_1_cols_2_channels_2048_tile_1024
dequant_1_cols_2_channels_2048_tile_1024_0
dequant_2_cols_1_channels_2048_tile_1024
dequant_2_cols_1_channels_2048_tile_1024_0
dequant_2_cols_2_channels_2048_tile_512
dequant_2_cols_2_channels_2048_tile_512_0
dequant_4_cols_1_channels_2048_tile_512
dequant_4_cols_1_channels_2048_tile_512_0
dequant_4_cols_2_channels_2048_tile_256
dequant_4_cols_2_channels_2048_tile_256_0
dequant_8_cols_1_channels_2048_tile_256
dequant_8_cols_1_channels_2048_tile_256_0
dequant_8_cols_2_channels_2048_tile_128
dequant_8_cols_2_channels_2048_tile_128_0
eltwise_add_1_cols_2_channels_2048_tile_2048
eltwise_add_2_cols_2_channels_2048_tile_1024
eltwise_add_4_cols_2_channels_2048_tile_512
eltwise_add_8_cols_2_channels_2048_tile_256
eltwise_mul_1_cols_2_channels_2048_tile_2048
eltwise_mul_2_cols_2_channels_2048_tile_1024
eltwise_mul_4_cols_2_channels_2048_tile_512
eltwise_mul_8_cols_2_channels_2048_tile_256
gelu_1_cols_1_channels_2048_tile_2048
gelu_1_cols_2_channels_2048_tile_1024
gelu_2_cols_1_channels_2048_tile_1024
gelu_2_cols_2_channels_2048_tile_512
gelu_4_cols_1_channels_2048_tile_512
gelu_4_cols_2_channels_2048_tile_256
gelu_8_cols_1_channels_2048_tile_256
gelu_8_cols_2_channels_2048_tile_128
gemm_2048x2048x2048_64x64x32_8_cols_0_bcolmaj_0_ccolmaj_0
gemm_2048x2048x2048_64x64x32_8_cols_0_bcolmaj_1_ccolmaj_0
gemm_2048x2048x2048_64x64x32_8_cols_1_bcolmaj_0_ccolmaj_0
gemm_2048x2048x2048_64x64x64_2_cols_0_bcolmaj_0_ccolmaj_0
gemm_2048x2048x2048_64x64x64_2_cols_0_bcolmaj_0_ccolmaj_0_0
gemm_2048x2048x2048_64x64x64_2_cols_0_bcolmaj_1_ccolmaj_0
gemm_2048x2048x2048_64x64x64_2_cols_0_bcolmaj_1_ccolmaj_0_0
gemm_2048x2048x2048_64x64x64_2_cols_1_bcolmaj_0_ccolmaj_0
gemm_2048x2048x2048_64x64x64_2_cols_1_bcolmaj_0_ccolmaj_0_0
gemm_2048x2048x2048_64x64x64_8_cols_0_bcolmaj_0_ccolmaj_0
layer_norm_1_cols_1_channels_2048_tile_2048
layer_norm_1_cols_2_channels_2048_tile_1024
layer_norm_2_cols_1_channels_2048_tile_1024
layer_norm_2_cols_2_channels_2048_tile_512
layer_norm_4_cols_1_channels_2048_tile_512
layer_norm_4_cols_2_channels_2048_tile_256
layer_norm_8_cols_1_channels_2048_tile_256
layer_norm_8_cols_2_channels_2048_tile_128
matrix_vector_mul_128x128_32_1col
matrix_vector_mul_128x128_32_1col0
matrix_vector_mul_2048x8192_1_1col
matrix_vector_mul_2048x8192_1_1col0
matrix_vector_mul_2048x8192_1_2col
matrix_vector_mul_2048x8192_1_2col0
matrix_vector_mul_2048x8192_1_4col
matrix_vector_mul_2048x8192_1_4col0
matrix_vector_mul_2048x8192_1_8col
matrix_vector_mul_2048x8192_1_8col0
matrix_vector_mul_8192x2048_4_1col
matrix_vector_mul_8192x2048_4_1col0
matrix_vector_mul_8192x2048_4_2col
matrix_vector_mul_8192x2048_4_2col0
matrix_vector_mul_8192x2048_4_4col
matrix_vector_mul_8192x2048_4_4col0
matrix_vector_mul_8192x2048_4_8col
matrix_vector_mul_8192x2048_4_8col0
mem_copy_16_cores_2_chans_2048_tile_128_False
mem_copy_16_cores_2_chans_2048_tile_128_False0
mem_copy_1_cols_1_channels_2048_tile_2048
mem_copy_1_cols_2_channels_2048_tile_1024
mem_copy_1_cores_1_chans_2048_tile_2048_False
mem_copy_1_cores_1_chans_2048_tile_2048_False0
mem_copy_2_cols_1_channels_2048_tile_1024
mem_copy_2_cols_2_channels_2048_tile_512
mem_copy_2_cores_1_chans_2048_tile_1024_False
mem_copy_2_cores_1_chans_2048_tile_1024_False0
mem_copy_2_cores_2_chans_2048_tile_1024_False
mem_copy_2_cores_2_chans_2048_tile_1024_False0
mem_copy_4_cols_1_channels_2048_tile_512
mem_copy_4_cols_2_channels_2048_tile_256
mem_copy_4_cores_1_chans_2048_tile_512_False
mem_copy_4_cores_1_chans_2048_tile_512_False0
mem_copy_4_cores_2_chans_2048_tile_512_False
mem_copy_4_cores_2_chans_2048_tile_512_False0
mem_copy_8_cols_1_channels_2048_tile_256
mem_copy_8_cols_2_channels_2048_tile_128
mem_copy_8_cores_1_chans_2048_tile_256_False
mem_copy_8_cores_1_chans_2048_tile_256_False0
mem_copy_8_cores_2_chans_2048_tile_256_False
mem_copy_8_cores_2_chans_2048_tile_256_False0
mha
mha0
relu_1_cols_1_channels_2048_tile_2048
relu_2_cols_1_channels_2048_tile_1024
relu_4_cols_1_channels_2048_tile_512
relu_8_cols_1_channels_2048_tile_256
rms_norm_1_cols_1_channels_2048_tile_2048
rms_norm_1_cols_2_channels_2048_tile_1024
rms_norm_2_cols_1_channels_2048_tile_1024
rms_norm_2_cols_2_channels_2048_tile_512
rms_norm_4_cols_1_channels_2048_tile_512
rms_norm_4_cols_2_channels_2048_tile_256
rms_norm_8_cols_1_channels_2048_tile_256
rms_norm_8_cols_2_channels_2048_tile_128
rope_1_cols_2_channels_4096_tile_4096_0
rope_2_cols_2_channels_4096_tile_2048_0
rope_4_cols_2_channels_4096_tile_1024_0
rope_8_cols_2_channels_4096_tile_512_0
sigmoid_1_cols_1_channels_2048_tile_2048
sigmoid_2_cols_1_channels_2048_tile_1024
sigmoid_4_cols_1_channels_2048_tile_512
sigmoid_8_cols_1_channels_2048_tile_256
silu_1_cols_1_channels_2048_tile_2048
silu_2_cols_1_channels_2048_tile_1024
silu_4_cols_1_channels_2048_tile_512
silu_8_cols_1_channels_2048_tile_256
softmax_1_cols_2_channels_4096_tile_2048
softmax_2_cols_2_channels_4096_tile_1024
softmax_2_cols_2_channels_4096_tile_512
swiglu (no metrics available)
swiglu_decode_1x2048x2048
swiglu_decode_1x2048x2048_0
tanh_1_cols_1_channels_2048_tile_2048
tanh_2_cols_1_channels_2048_tile_1024
tanh_4_cols_1_channels_2048_tile_512
tanh_8_cols_1_channels_2048_tile_256
transpose_2048_M_64_N_1_cols_1_channels_64_m_64_n_8_s
transpose_2048_M_64_N_1_cols_1_channels_64_m_64_n_8_s0
transpose_2048_M_64_N_1_cols_2_channels_64_m_64_n_8_s
transpose_2048_M_64_N_1_cols_2_channels_64_m_64_n_8_s0
weighted_rms_norm_1_cols_2_channels_2048_weights_2048
weighted_rms_norm_2_cols_2_channels_2048_weights_1024
weighted_rms_norm_4_cols_2_channels_2048_weights_512
weighted_rms_norm_8_cols_2_channels_2048_weights_256
acc_buffer = Buffer(
    type=C_l1_ty_internal, name=f"acc_buffer_{row}_{col}"
)
Can buffers not be defined inside the core function? I guess there's no required MLIR context there?
The Buffer is added to the context when it is passed to the function, but this is transparent to the user. And yes: instead of having a GlobalBuffer and a LocalBuffer, there is now just a single Buffer that must be declared globally.
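To make the new convention concrete, here is a minimal sketch assuming an IRON-style Python API. Only Buffer's type/name keyword arguments come from the diff above; the import path, the acc_ty element type, core_fn, and the Worker wiring are illustrative assumptions, not the PR's actual code.

```python
import numpy as np

# Hypothetical imports: this PR suggests Buffer replaces
# GlobalBuffer/LocalBuffer; Worker is IRON's existing worker class.
from aie.iron import Buffer, Worker

# Stand-in for C_l1_ty_internal from the snippet above
# (assumed here to be a 64x64 float32 L1 tile type).
acc_ty = np.ndarray[(64, 64), np.dtype[np.float32]]

# Buffers are now declared at module scope rather than inside the
# core function; no MLIR context is needed at declaration time.
acc_buffer = Buffer(type=acc_ty, name="acc_buffer_0_0")


def core_fn(acc):
    # The core body receives the globally declared buffer as an
    # argument and uses it like a local array.
    for i in range(64):
        for j in range(64):
            acc[i, j] = 0.0


# Passing the buffer to the Worker is the point at which it is
# attached to the MLIR context, transparently to the user.
my_worker = Worker(core_fn, fn_args=[acc_buffer])
```

The upshot of this design is that buffer declaration stays declarative and context-free, while context registration happens at the single point where the worker is assembled.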
This PR updates the mlir-aie version from v1.1.3 to the latest release, v1.1.4.
Added
Changed
Removed
PR Merge Checklist
This PR is based on the latest devel commit and pointing to devel.