# Hailo

Hailo produces AI accelerator chips which are used for deep learning.
An example of a system that uses them is the Raspberry Pi AI HAT+.

## Setup

Depending on the system and the type of access, various components may have to be set up.
For Ubuntu-based systems like the Raspberry Pi the relevant packages are often named hailort-pcie-driver and hailort.
When using the Python programming language refer to the Hailo section of the Python article.

## Usage

This section addresses various usages of the Hailo software.

### Preparing TensorFlow Models for the AI HAT+

For neural networks to run on the Hailo AI module and the AI HAT+ they have to be converted to the .hef format. This section assumes the neural network is using TensorFlow and is available as a .tf or .tflite file.

To convert TensorFlow models, the Hailo AI Software Suite first needs to be downloaded.
This can be done from the official website, although an account is needed for it.

After downloading, extracting and navigating into the resulting folder, a heavily customized Docker container can be started by running the following command.
Before doing so it is recommended to slightly modify the script hailo_ai_sw_suite_docker_run.sh: add a volume containing the TensorFlow model to be converted to the variable DOCKER_ARGS which is set in that file.

```sh
./hailo_ai_sw_suite_docker_run.sh
```
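As an illustration, assuming the model lies in `/home/user/models` on the host and should appear as `/local/models` inside the container (both paths are hypothetical), the `DOCKER_ARGS` line in hailo_ai_sw_suite_docker_run.sh could be extended like this:

```sh
# Hypothetical paths: mount the host's model directory into the container
# by appending a bind-mount to the arguments passed to docker run.
DOCKER_ARGS="${DOCKER_ARGS} -v /home/user/models:/local/models"
```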

Using the tools included in this container, a .tf or .tflite model can be converted to the .hef format.

For this, run the following commands inside the Docker container.
The first command takes the path to the TensorFlow model (<path-to-tf-model>) and will output a .har model.
The second command is optional but recommended; it takes the path to this .har model (<path-to-har-model>) and returns an optimized .har model.
The third and final command compiles the (optimized) .har model given as input and outputs the final .hef model, which can then be used with the Hailo AI module.

```sh
hailo parser tf <path-to-tf-model>
hailo optimize --use-random-calib-set <path-to-har-model>
hailo compiler <path-to-optimized-har-model>
```
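Put together, a conversion run might look like the following (the file name model.tflite is hypothetical, and the names of the intermediate .har files are assumptions - the tools choose the actual output names based on the input file):

```sh
# Hypothetical file names; adjust to the names the tools actually produce.
hailo parser tf model.tflite
hailo optimize --use-random-calib-set model.har
hailo compiler model_optimized.har
```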

If a calibration data set is available, the following command can be used to optimize the model with it.
<data-set> is the path to the data set: a directory containing .npy files that each store a single data point (for example an image) in the model's input format.

```sh
hailo optimize --calib-set-path <data-set> <path-to-har-model>
```
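As a sketch, such a directory could be filled from preprocessed data like this (the directory name calib-set, the number of samples and the input shape 224x224x3 are all hypothetical, and the snippet assumes NumPy is installed; real calibration data should of course come from representative inputs, not random numbers):

```sh
# Hypothetical: create a calibration directory with 8 samples
# in an assumed input shape of 224x224x3.
mkdir -p calib-set
python3 - <<'EOF'
import numpy as np
for i in range(8):
    np.save(f"calib-set/sample_{i}.npy",
            np.random.rand(224, 224, 3).astype("float32"))
EOF
```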

Note that the user inside the Docker container usually has a different UID and GID than the host user.
To make the volume and its files accessible inside the container, the ownership of the files in the volume should be changed accordingly - for example as shown in the following command.
<volume-path> is the path pointing to the volume, <uid> is the UID of the Docker user - which can be found using id -u (for example 10642) - and <gid> is the GID of the Docker user - which can be found using id -g (for example 10600).

```sh
chown -R <uid>:<gid> <volume-path>
```

After the models have been converted, the ownership can be changed back to the host user's UID and GID.

The converted models can then be run using the Python programming language as described in the Python article.