diff --git a/wiki/hailo.md b/wiki/hailo.md
new file mode 100644
index 0000000..bc5f248
--- /dev/null
+++ b/wiki/hailo.md
@@ -0,0 +1,64 @@
+# Hailo
+
+[Hailo](https://hailo.ai) produces AI accelerator chips which are used for deep learning.
+An example of a system that uses them is the
+[Raspberry Pi's AI Hat
+](/wiki/linux/raspberry_pi.md#ai-hat).
+
+## Usage
+
+This section addresses various uses of the Hailo software.
+
+### Preparing TensorFlow Models for the AI HAT+
+
+For [neural networks](/wiki/neural_network.md) to run on the Hailo AI module and the AI HAT+,
+they have to be converted to the `.hef` format.
+This section assumes the neural network uses
+[TensorFlow](/wiki/programming_language/python.md#tensorflow) and is available as a `.tf` or
+`.tflite` file.
+
+To convert TensorFlow models, the Hailo 8 Software Suite first needs to be downloaded.
+This can be done from the [official website](https://hailo.ai/developer-zone/software-downloads/),
+although an account is needed for it.
+
+After downloading, extracting and navigating into the folder, a heavily customized
+[Docker](/wiki/docker.md) container can be started by running the following command.
+However, it is recommended to slightly modify the startup script first:
+in the environment variable `DOCKER_ARGS`, which is set in the file
+`hailo_ai_sw_suite_docker_run.sh`, add a volume that contains the TensorFlow model to be
+converted.
+
+```sh
+./hailo_ai_sw_suite_docker_run.sh
+```
+
+Using the tools included in this container, a `.tf` or `.tflite` model can be converted to the
+`.hef` format.
+
+To do this, run the following commands inside the Docker container.
+The first command takes the path to the TensorFlow model (`<path-to-model>`) and outputs a
+`.har` model.
+The second command is optional but recommended; it takes the path to this `.har` model
+(`<path-to-har-model>`) and optimizes it using a random calibration set.
+The third command compiles the `.har` model into the final `.hef` file.
+
+```sh
+hailo parser tf <path-to-model>
+hailo optimize <path-to-har-model> --use-random-calib-set
+hailo compiler <path-to-har-model>
+```
+
+Note that the user in the Docker container usually has a different UID and GID.
+To make the volume and its files accessible inside the container, the ownership of the files in
+the volume should be changed accordingly, as shown in the following example.
+`<path-to-volume>` is the path that points to the volume,
+`<uid>` is the UID of the Docker user, which can be found using `id -u` (for example `10642`),
+and `<gid>` is the GID of the Docker user, which can be found using `id -g` (for example `10600`).
+
+```sh
+chown -R <uid>:<gid> <path-to-volume>
+```
+
+After the models have been converted, the ownership change can be reversed using the system
+user's UID and GID.
+
+The converted models can then be run using the Python programming language as described in the
+[Python article](/wiki/programming_language/python.md#hailo).
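+
+As a rough illustration, running a converted model with the HailoRT Python bindings (the
+`hailo_platform` package that ships with the Hailo software) can look like the following sketch.
+The file name `model.hef`, the dummy input and the exact class and function names are assumptions
+based on Hailo's examples and may differ between versions.
+
+```python
+import numpy as np
+from hailo_platform import (HEF, VDevice, HailoStreamInterface, ConfigureParams,
+                            InputVStreamParams, OutputVStreamParams, InferVStreams, FormatType)
+
+# Load the compiled model (placeholder file name).
+hef = HEF("model.hef")
+
+with VDevice() as device:
+    # Configure the device with the network contained in the HEF file.
+    configure_params = ConfigureParams.create_from_hef(hef, interface=HailoStreamInterface.PCIe)
+    network_group = device.configure(hef, configure_params)[0]
+    network_group_params = network_group.create_params()
+
+    # Create input and output virtual stream parameters using float32 buffers.
+    input_params = InputVStreamParams.make(network_group, format_type=FormatType.FLOAT32)
+    output_params = OutputVStreamParams.make(network_group, format_type=FormatType.FLOAT32)
+
+    # Build a dummy batch of one frame matching the network's input shape.
+    input_info = hef.get_input_vstream_infos()[0]
+    batch = np.zeros((1, *input_info.shape), dtype=np.float32)
+
+    # Run inference; the result maps each output stream name to a NumPy array.
+    with network_group.activate(network_group_params):
+        with InferVStreams(network_group, input_params, output_params) as pipeline:
+            results = pipeline.infer({input_info.name: batch})
+            print(results)
+```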
diff --git a/wiki/linux/raspberry_pi.md b/wiki/linux/raspberry_pi.md
index 5f139f1..76227bd 100644
--- a/wiki/linux/raspberry_pi.md
+++ b/wiki/linux/raspberry_pi.md
@@ -33,62 +33,5 @@ This section addresses them.
 ### AI HAT+
 
 The [AI HAT](https://www.raspberrypi.com/documentation/accessories/ai-hat-plus.html) is an
-extension which uses the Hailo AI module for use with the [Raspberry Pi
-5](https://www.raspberrypi.com/products/raspberry-pi-5).
-
-#### AI HAT+ Usage
-
-This section addresses the usage of the
-[AI HAT](https://www.raspberrypi.com/documentation/accessories/ai-hat-plus.html).
-
-#### Preparing TensorFlow Models for the AI HAT+
-
-For [neural networks](/wiki/neural_network.md) to run on the Hailo AI module and the AI HAT+ they
-have to be converted to the `.hef` format.
-This section assumes the neural network is using
-[TensorFlow](/wiki/programming_language/python.md#tensorflow) and is available as a `.tf` or
-`.tflite` file.
-
-To convert TensorFlow models first the Hailo 8 Software Suite needs to be downloaded.
-This can be done from the [official website](https://hailo.ai/developer-zone/software-downloads/)
-altough an account is needed for it.
-
-After downloading, extracting and then navigating into the folder a heavily customized
-[Docker](/wiki/docker.md) container can be started by running the following command.
-However it is recommended to slightly modify this file.
-Add a volume that contains the TensorFlow model, that is to be converted, to the environment
-variable `DOCKER_ARGS` which is set in the file `hailo_ai_sw_suite_docker_run.sh`.
-
-```sh
-./hailo_ai_sw_suite_docker_run.sh
-```
-
-Using the tools which come in this container a `.tf` or `.tflite` model can be converted to the
-`.hef` format.
-
-For this to work run the following commands inside the Docker container.
-The first command takes the path to the tensorflow model (``) and will output a
-`.har` model.
-The second command is optional but recommended and takes the path to this `.har` model
-(`
-hailo optimize --use-random-calib-set
-hailo compiler
-```
-
-Note that the user in the Docker container usually uses anothr UID and GID.
-To make the volume and files accessible inside the container the IDs of the files in the volume
-should be changed accordingly - for example as shown in the following example.
-`` is the path that points to the volume
-`` is the UID of the Docker user - which can be found using `id -u` (for example `10642`) -
-and `` the GID of the Docker user - which can be found using `id -g` (for example `10600`).
-
-```sh
-chown -R :
-```
-
-After the models have been converted it can be reversed using the systems user UID and GID.
+extension for the [Raspberry Pi 5](https://www.raspberrypi.com/products/raspberry-pi-5) which
+uses the [Hailo AI module](/wiki/hailo.md).
diff --git a/wiki/programming_language/python.md b/wiki/programming_language/python.md
index 0af5ce9..21d5506 100644
--- a/wiki/programming_language/python.md
+++ b/wiki/programming_language/python.md
@@ -189,6 +189,20 @@ torch.cuda.is_available()
 
 This should give back `True`.
 
+### Hailo
+
+The Python package for [Hailo chips](/wiki/hailo.md) has to be downloaded from the
+[official website](https://hailo.ai/developer-zone/software-downloads).
+
+Hailo chips can be used to run converted [TensorFlow](#tensorflow) models.
+The conversion process is explained in the
+[Hailo article](/wiki/hailo.md#preparing-tensorflow-models-for-the-ai-hat).
+
+To run inference using Python on ARM boards like the
+[Raspberry Pi AI Hat
+](/wiki/linux/raspberry_pi.md#ai-hat),
+[zlodeibaal's article on Medium](https://medium.com/@zlodeibaal/how-to-run-hailo-on-arm-boards-d2ad599311fa)
+can be referenced.
+
 ### TensorFlow
 
 This section addresses the [TensorFlow module](https://www.tensorflow.org/).