8 changes: 7 additions & 1 deletion Projects/Projects/Always-On-AI-with-Ethos-U85-NPU.md

## Resources from Arm and our partners
- Arm Developer: [Edge AI](https://developer.arm.com/edge-ai)
- Arm Alif Code-along: [Advanced AI on Arm Embedded Systems](https://developer.arm.com/code-along/advanced-ai-on-arm-embedded-systems)
- Code-along repo: [Ethos-U Workshop](https://github.com/ArmDeveloperEcosystem/workshop-ethos-u)
- Learning Path: [Navigating Machine Learning with Ethos-U processors](https://learn.arm.com/learning-paths/microcontrollers/nav-mlek/)
- Learning Path: [Visualize Ethos-U NPU performance with ExecuTorch on Arm FVPs](https://learn.arm.com/learning-paths/embedded-and-microcontrollers/visualizing-ethos-u-performance)
- Repository: [AI on Arm course](https://github.com/arm-university/AI-on-Arm)
- Example Board: [Alif Ensemble DevKit E8](https://www.keil.arm.com/boards/alif-semiconductor-devkit-e8-gen-1-2558a7b/features/)
- Documentation: [TOSA Specification](https://www.mlplatform.org/tosa/), [TOSA Model Explorer](https://github.com/arm/tosa-adapter-model-explorer), and [TOSA Reference Model](https://gitlab.arm.com/tosa/tosa-reference-model)
- [Model Explorer](https://ai.google.dev/edge/model-explorer)
- PyTorch Blog: [ExecuTorch support for Ethos-U85](https://pytorch.org/blog/pt-executorch-ethos-u85/)
- PyTorch Tutorial: [Arm Ethos-U NPU Backend Tutorial](https://docs.pytorch.org/executorch/1.0/tutorial-arm-ethos-u.html)

## Support Level

1 change: 1 addition & 0 deletions Projects/Projects/Edge-AI-On-Mobile.md
## Resources from Arm and our partners

- Arm Developer: [Launchpad - Mobile AI](https://developer.arm.com/mobile-graphics-and-gaming/ai-mobile)
- Learning Path: [Profile ExecuTorch models with SME2 on Arm](https://learn.arm.com/learning-paths/cross-platform/sme-executorch-profiling/)
- Learning Path: [Mobile AI/ML Performance Profiling](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/profiling-ml-on-arm/)
- Learning Path: [Build an Android chat app with Llama, KleidiAI, ExecuTorch, and XNNPACK](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/build-llama3-chat-android-app-using-executorch-and-xnnpack/)
- Learning Path: [Vision LLM Inference on Android with KleidiAI](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/vision-llm-inference-on-android-with-kleidiai-and-mnn/)
39 changes: 16 additions & 23 deletions Projects/Projects/Ethos-U85-NPU-Applications.md

**Project Summary**

Using hardware such as the Alif Ensemble E4/E6/E8 DevKits (all include Ethos-U85), your task is to design and benchmark an advanced edge inference application that exploits the Ethos-U85’s compute and transformer capabilities.

Without access to Alif hardware, you can use the Arm Corstone-320 Fixed Virtual Platform to prototype and functionally test your application. Prove functional correctness on the FVP first, then measure performance on actual silicon later. We are interested in seeing projects both in simulation and on final hardware.

Your project should include:

**Model Deployment and Optimization**
Select a computationally intensive model — ideally transformer-based or multi-branch convolutional — and deploy it on the Ethos-U85 using:
- The TOSA Model Explorer extension to inspect and adapt unsupported or experimental models for TOSA compliance.
- Model Explorer to inspect models and identify problem layers that limit delegation to the Ethos-U backend.
- The Vela compiler for optimization.

These tools can be used to:
- Convert and visualize model graphs in TOSA format.
- Identify unsupported operators.
- Modify or substitute layers for compatibility using the Flatbuffers schema before re-exporting.
- Run Vela for optimized compilation targeting Ethos-U85.
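In practice, the Vela step is usually scripted as part of a build. The snippet below is a minimal sketch of assembling a Vela invocation from Python; the flag names follow Vela's documented CLI, but treat the exact values (accelerator variant, file names) as assumptions to check against `vela --help` for your installed version.

```python
import shlex

def vela_command(model_path, accelerator="ethos-u85-256", output_dir="out"):
    """Assemble a Vela compiler invocation as an argument list.

    The flags below follow Vela's documented CLI; confirm them against
    `vela --help` for the version you have installed.
    """
    return [
        "vela",
        model_path,                            # quantized .tflite input
        "--accelerator-config", accelerator,   # NPU variant and MAC configuration
        "--output-dir", output_dir,            # where the optimised model is written
    ]

cmd = vela_command("model_int8.tflite")
print(shlex.join(cmd))
# In a build script, hand the list to subprocess.run(cmd, check=True).
```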

**Application Demonstration**
Implement a working example that highlights the Ethos-U85’s strengths in real-world inference. Possible categories include:
- Transformers on Edge: lightweight BERT, ViT, or audio transformers (e.g. speech or sound event classification).
- High-resolution Vision: semantic segmentation, object detection on large input sizes, or multi-head perception networks.
- Multi-modal Fusion: combining audio, image, or sensor streams for contextual understanding.

**Analysis and Benchmarking**
Report quantitative results on:
- Inference latency, throughput (FPS or tokens/s), and memory footprint.
- Power efficiency under load (optional).
- Comparative performance versus Ethos-U55/U65 (use available benchmarks for reference or utilise the other Ethos-U NPUs provided in the Alif DevKits).
- The effect of TOSA optimization — demonstrate measurable improvements from graph conversion and operator fusion.
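To keep latency and throughput figures comparable across model variants, it helps to standardise the measurement loop. Below is a generic, framework-free Python harness; `fake_inference` is a stand-in workload to replace with a call into your inference runtime.

```python
import statistics
import time

def benchmark(run_inference, warmup=5, iters=50):
    """Measure latency statistics for a no-argument inference callable."""
    for _ in range(warmup):          # warm-up runs are excluded from the stats
        run_inference()
    latencies_ms = []
    for _ in range(iters):
        start = time.perf_counter()
        run_inference()
        latencies_ms.append((time.perf_counter() - start) * 1000.0)
    latencies_ms.sort()
    return {
        "mean_ms": statistics.mean(latencies_ms),
        "p50_ms": latencies_ms[len(latencies_ms) // 2],
        "p95_ms": latencies_ms[int(len(latencies_ms) * 0.95) - 1],
        "fps": 1000.0 / statistics.mean(latencies_ms),
    }

# Stand-in workload; replace with a call into your NPU runtime.
def fake_inference():
    sum(i * i for i in range(10_000))

stats = benchmark(fake_inference)
print({k: round(v, 3) for k, v in stats.items()})
```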

---

## What kind of projects should you target?

To clearly demonstrate the leap from Ethos-U55/U65 to U85, choose projects that meet at least one of the following criteria:

- Transformer-heavy architectures: e.g. attention blocks, transformer encoders, ViTs, or hybrid CNN+transformer models.
- *Example:* an audio event detection transformer that must process longer sequences or higher-resolution spectrograms.
- High-resolution or multi-branch networks: models with high input dimensionality or multiple processing paths that saturate NPU throughput.
- *Example:* 512×512 semantic segmentation or multi-object detection.
- Dense post-processing or large fully connected layers: cases where U55/U65 memory limits or MAC bandwidth previously restricted performance.
- *Example:* large MLP heads or transformer token mixers.
- Multi-modal pipelines: combining multiple sensor inputs (e.g. image + IMU + audio) where the NPU must maintain concurrency or shared intermediate representations.

The Ethos-U85 is ideal for projects where model performance is constrained by attention layers, large activations, or operator types that previously required fallback to the CPU. Use the Ethos-U85 to eliminate those fallbacks and achieve full-NPU execution of advanced topologies.

---

## What will you use?
You should be familiar with, or willing to learn about:
- Programming: Python, C/C++
- ExecuTorch or LiteRT
- Techniques for optimising AI models for the edge (quantization, pruning, etc.)
- Optimization Tools:
- TOSA Model Explorer
- .tflite to .tosa converter (if using TensorFlow rather than ExecuTorch)
- Model Explorer with TOSA adapter (and PTE adapter for ExecuTorch)
- Vela compiler for Ethos-U
- Bare-metal or RTOS (e.g., Zephyr)
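If post-training quantization is new to you, the core idea fits in a few lines. The following is an illustrative, framework-free sketch of symmetric per-tensor int8 quantization. Real toolchains (ExecuTorch, LiteRT, Vela) handle this for you, so treat it as a mental model rather than production code.

```python
def quantize_int8(values):
    """Symmetric per-tensor int8 quantization: the scale is chosen so the
    largest-magnitude value maps to +/-127."""
    max_abs = max(abs(v) for v in values)
    scale = max_abs / 127.0 if max_abs else 1.0
    q = [max(-128, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    return [qi * scale for qi in q]

weights = [0.42, -1.3, 0.07, 1.27, -0.9]
q, scale = quantize_int8(weights)
recovered = dequantize(q, scale)
# Each recovered weight lies within half a quantization step of the original.
assert all(abs(a - b) <= scale / 2 for a, b in zip(weights, recovered))
print(q, round(scale, 5))
```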

---

## Resources from Arm and our partners
- Arm Developer: [Edge AI](https://developer.arm.com/edge-ai)
- Arm Alif Code-along: [Advanced AI on Arm Embedded Systems](https://developer.arm.com/code-along/advanced-ai-on-arm-embedded-systems)
- Code-along repo: [Ethos-U Workshop](https://github.com/ArmDeveloperEcosystem/workshop-ethos-u)
- Learning Path: [Navigating Machine Learning with Ethos-U processors](https://learn.arm.com/learning-paths/microcontrollers/nav-mlek/)
- Learning Path: [Visualize Ethos-U NPU performance with ExecuTorch on Arm FVPs](https://learn.arm.com/learning-paths/embedded-and-microcontrollers/visualizing-ethos-u-performance)
- Repository: [AI on Arm course](https://github.com/arm-university/AI-on-Arm)
- Example Board: [Alif Ensemble DevKit E8](https://www.keil.arm.com/boards/alif-semiconductor-devkit-e8-gen-1-2558a7b/features/)
- Documentation: [TOSA Specification](https://www.mlplatform.org/tosa/), [TOSA Model Explorer](https://github.com/arm/tosa-adapter-model-explorer), and [TOSA Reference Model](https://gitlab.arm.com/tosa/tosa-reference-model)
- [Model Explorer](https://ai.google.dev/edge/model-explorer)
- PyTorch Blog: [ExecuTorch support for Ethos-U85](https://pytorch.org/blog/pt-executorch-ethos-u85/)
- PyTorch Tutorial: [Arm Ethos-U NPU Backend Tutorial](https://docs.pytorch.org/executorch/1.0/tutorial-arm-ethos-u.html)

---

## Support Level

---
title: "Game development using Arm Neural Graphics with Unreal Engine"
description: "Build a playable Unreal Engine 5 game demo that utilises Arm’s Neural Graphics SDK UE plugin for features such as Neural Super Sampling (NSS). Showcase improved graphical fidelity at lower resolution by driving neural rendering directly in the graphics pipeline."
subjects:
- "ML"
- "Gaming"
badges:
donation:
---

![learn_on_arm](../../images/Learn_on_Arm_banner.png)

## Description

### Project Summary

Create a small game scene utilising the Arm Neural Graphics UE plugin to demonstrate:
- **Improved graphical fidelity despite lower resolution** (render low → upscale with NSS)

Document your progress and findings and consider alternative applications of the neural technology within games development.


Make your scenes dynamic with particle effects, shadows, physics and motion.

---
**Beyond the plugin**

Want to go further with Neural Graphics? After building your game with the NSS Unreal plugin, try out the [Vulkan ML Extensions learning path](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/vulkan-ml-sample/) to explore how neural inference runs directly through the Vulkan API. This gives you lower-level control over ML workloads in the graphics pipeline and lets you prototype custom neural effects or optimise performance beyond what’s exposed through the engine plugin. You may also want to explore [fine-tuning your own neural models with the Arm Neural Graphics Model Gym](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/model-training-gym/) and how to [apply different quantization strategies](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/quantize-neural-upscaling-models/) to optimise memory and latency.

## Pre-requisites
- Laptop/PC/Mobile for Android Unreal Engine game development
- Get Started Blog: [Start experimenting with NSS today](https://developer.arm.com/community/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/how-to-access-arm-neural-super-sampling)
- Deep Dive Blog: [How NSS works](https://developer.arm.com/community/arm-community-blogs/b/mobile-graphics-and-gaming-blog/posts/how-arm-neural-super-sampling-works)
- Arm Developer: [Neural Graphics Development Kit](https://developer.arm.com/mobile-graphics-and-gaming/neural-graphics)
- Learning Path: [Fine-tuning neural graphics models with Model Gym](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/model-training-gym/)
- Learning Path: [Neural Super Sampling in Unreal Engine](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/nss-unreal/)
- Learning Path: [Getting started with Arm Accuracy Super Resolution (Arm ASR)](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/get-started-with-arm-asr/)
- Unreal Engine Intro by Epic Games: [Understanding the basics](https://dev.epicgames.com/documentation/en-us/unreal-engine/understanding-the-basics-of-unreal-engine)
- Repo: [Arm Neural Graphics SDK](https://github.com/arm/neural-graphics-sdk-for-game-engines)
- Repo: [Arm Neural Graphics for Unreal](https://github.com/arm/neural-graphics-for-unreal)
- Repo: [Arm Neural Graphics Model Gym](https://github.com/arm/neural-graphics-model-gym)
- Documentation: [Arm Neural Graphics SDK for Game Engines Developer guide](https://developer.arm.com/documentation/111167/latest/)


## Benefits

Standout project contributions to the community will earn digital badges. These badges can support CV or resumé building and demonstrate earned recognition. Contributions may also be highlighted in case studies or newsletters.


To receive the benefits, you must show us your project through our [online form](https://forms.office.com/e/VZnJQLeRhD). Please do not include any confidential information in your contribution. Additionally if you are affiliated with an academic institution, please ensure you have the right to share your material.
2 changes: 1 addition & 1 deletion docs/_data/navigation.yml
projects:
- title: Game-Dev-Using-Neural-Graphics-&-Unreal-Engine
description: "Build a playable Unreal Engine 5 game demo that utilises Arm\u2019\
s Neural Graphics SDK UE plugin for features such as Neural Super Sampling (NSS).\
\ Showcase improved graphical fidelity at lower resolution by driving neural\
\ rendering directly in the graphics pipeline."
url: /2025/11/27/Game-Dev-Using-Neural-Graphics-&-Unreal-Engine.html
subjects:
16 changes: 14 additions & 2 deletions docs/_posts/2025-11-27-Always-On-AI-with-Ethos-U85-NPU.md
full_description: |-

## Resources from Arm and our partners
- Arm Developer: [Edge AI](https://developer.arm.com/edge-ai)
- Arm Alif Code-along: [Advanced AI on Arm Embedded Systems](https://developer.arm.com/code-along/advanced-ai-on-arm-embedded-systems)
- Code-along repo: [Ethos-U Workshop](https://github.com/ArmDeveloperEcosystem/workshop-ethos-u)
- Learning Path: [Navigating Machine Learning with Ethos-U processors](https://learn.arm.com/learning-paths/microcontrollers/nav-mlek/)
- Learning Path: [Visualize Ethos-U NPU performance with ExecuTorch on Arm FVPs](https://learn.arm.com/learning-paths/embedded-and-microcontrollers/visualizing-ethos-u-performance)
- Repository: [AI on Arm course](https://github.com/arm-university/AI-on-Arm)
- Example Board: [Alif Ensemble DevKit E8](https://www.keil.arm.com/boards/alif-semiconductor-devkit-e8-gen-1-2558a7b/features/)
- Documentation: [TOSA Specification](https://www.mlplatform.org/tosa/), [TOSA Model Explorer](https://github.com/arm/tosa-adapter-model-explorer), and [TOSA Reference Model](https://gitlab.arm.com/tosa/tosa-reference-model)
- [Model Explorer](https://ai.google.dev/edge/model-explorer)
- PyTorch Blog: [ExecuTorch support for Ethos-U85](https://pytorch.org/blog/pt-executorch-ethos-u85/)
- PyTorch Tutorial: [Arm Ethos-U NPU Backend Tutorial](https://docs.pytorch.org/executorch/1.0/tutorial-arm-ethos-u.html)

## Support Level


## Resources from Arm and our partners
- Arm Developer: [Edge AI](https://developer.arm.com/edge-ai)
- Arm Alif Code-along: [Advanced AI on Arm Embedded Systems](https://developer.arm.com/code-along/advanced-ai-on-arm-embedded-systems)
- Code-along repo: [Ethos-U Workshop](https://github.com/ArmDeveloperEcosystem/workshop-ethos-u)
- Learning Path: [Navigating Machine Learning with Ethos-U processors](https://learn.arm.com/learning-paths/microcontrollers/nav-mlek/)
- Learning Path: [Visualize Ethos-U NPU performance with ExecuTorch on Arm FVPs](https://learn.arm.com/learning-paths/embedded-and-microcontrollers/visualizing-ethos-u-performance)
- Repository: [AI on Arm course](https://github.com/arm-university/AI-on-Arm)
- Example Board: [Alif Ensemble DevKit E8](https://www.keil.arm.com/boards/alif-semiconductor-devkit-e8-gen-1-2558a7b/features/)
- Documentation: [TOSA Specification](https://www.mlplatform.org/tosa/), [TOSA Model Explorer](https://github.com/arm/tosa-adapter-model-explorer), and [TOSA Reference Model](https://gitlab.arm.com/tosa/tosa-reference-model)
- [Model Explorer](https://ai.google.dev/edge/model-explorer)
- PyTorch Blog: [ExecuTorch support for Ethos-U85](https://pytorch.org/blog/pt-executorch-ethos-u85/)
- PyTorch Tutorial: [Arm Ethos-U NPU Backend Tutorial](https://docs.pytorch.org/executorch/1.0/tutorial-arm-ethos-u.html)

## Support Level

2 changes: 2 additions & 0 deletions docs/_posts/2025-11-27-Edge-AI-On-Mobile.md
full_description: |-
## Resources from Arm and our partners

- Arm Developer: [Launchpad - Mobile AI](https://developer.arm.com/mobile-graphics-and-gaming/ai-mobile)
- Learning Path: [Profile ExecuTorch models with SME2 on Arm](https://learn.arm.com/learning-paths/cross-platform/sme-executorch-profiling/)
- Learning Path: [Mobile AI/ML Performance Profiling](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/profiling-ml-on-arm/)
- Learning Path: [Build an Android chat app with Llama, KleidiAI, ExecuTorch, and XNNPACK](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/build-llama3-chat-android-app-using-executorch-and-xnnpack/)
- Learning Path: [Vision LLM Inference on Android with KleidiAI](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/vision-llm-inference-on-android-with-kleidiai-and-mnn/)
## Resources from Arm and our partners

- Arm Developer: [Launchpad - Mobile AI](https://developer.arm.com/mobile-graphics-and-gaming/ai-mobile)
- Learning Path: [Profile ExecuTorch models with SME2 on Arm](https://learn.arm.com/learning-paths/cross-platform/sme-executorch-profiling/)
- Learning Path: [Mobile AI/ML Performance Profiling](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/profiling-ml-on-arm/)
- Learning Path: [Build an Android chat app with Llama, KleidiAI, ExecuTorch, and XNNPACK](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/build-llama3-chat-android-app-using-executorch-and-xnnpack/)
- Learning Path: [Vision LLM Inference on Android with KleidiAI](https://learn.arm.com/learning-paths/mobile-graphics-and-gaming/vision-llm-inference-on-android-with-kleidiai-and-mnn/)