Edge Insights for Autonomous Mobile Robots (EI for AMR) Developer Guide - 2022.3.1
Contents
Chapter 1: Edge Insights for Autonomous Mobile Robots (EI for AMR)
  How it Works
  Recommended Hardware
  edgesoftware Command Line Interface (CLI)
  EI for AMR Robot Tutorials
    UPS 6000 and UP Xtreme i11 Robot Kits
    Create Your Own Robot Kit
      Step 1: Hardware Assembly
      Step 2: Integration into Edge Insights for Autonomous Mobile Robots
      Step 3: Robot Base Node ROS 2 Node
      Step 4: Robot Base Node ROS 2 Navigation Parameter File
      Step 5: Navigation Full Stack
    Perception
      Intel® RealSense™ ROS 2 Sample Application
      ROS 2 OpenVINO™ Toolkit Sample Application
      OpenVINO™ Sample Application
      2D LIDAR and ROS 2 Cartographer
      GStreamer* Pipelines
      Point Cloud Library (PCL) Optimized for the Intel® oneAPI Base Toolkit
    Navigation
      Collaborative Visual SLAM
      Kudan Visual SLAM
      FastMapping Algorithm
      ADBSCAN Algorithm
      ITS Path Planner ROS 2 Navigation Plugin
      ITS Path Planner Plugin Customization
      Intel® oneAPI Base Toolkit Sample Application
      Robot Teleop Using a Gamepad
      Robot Teleop Using a Keyboard
    Simulation
      turtlesim
      Wandering Application in an ARIAC Gazebo* Simulation
      Wandering Application in a Waffle Gazebo* Simulation
    Benchmarking and Profiling
      VTune™ Profiler in a Docker* Container
      OpenVINO™ Benchmarking Tool
    EI for AMR Container on a Virtual Machine
    Fibocom's FM350 5G Module Integration
    Change Existing and Add New Docker* Images to the EI for AMR SDK
    Troubleshooting for Robot Tutorials
  EI for AMR Robot Orchestration Tutorials
    Device Onboarding End-to-End Use Case
    Basic Fleet Management
      Basic Fleet Management Use Case
      Remote Inference End-to-End Use Case
    OTA Updates
1 Edge Insights for Autonomous Mobile Robots (EI for AMR) Developer Guide
Collateral                                          Description
edgesoftware Command Line Interface (CLI)           Intel's Developer Catalog package management
Troubleshooting for Robot Orchestration Tutorials   Help with robot orchestration tutorials
Get Started Guide for Robot Orchestration           Server Complete Kit installation
How it Works
The Edge Insights for Autonomous Mobile Robots (EI for AMR) modules are deployed via Docker* containers
for an enhanced developer experience (DX), support for Continuous Integration and Continuous Deployment
(CI/CD) practices, and flexible deployment in different execution environments, including robot, development
PC, server, and cloud.
This section provides an overview of the modules and services featured with Edge Insights for Autonomous
Mobile Robots.
ROS 2 with the Data Distribution Service (DDS) is used as a message bus. This Publisher-Subscriber
architecture based on ROS 2 topics decouples data providers from consumers.
Camera and LIDAR sensor data is abstracted with ROS 2 topics.
Video streaming processing pipelines are supported by GStreamer*. GStreamer* is a library for constructing
graphs of media-handling components. It decouples sensor ingestion, video processing, and AI object
detection via the OpenVINO™ toolkit DL Streamer framework. The applications it supports range from simple
Ogg Vorbis audio playback and audio/video streaming to complex audio (mixing) and video (non-linear
editing) processing.
Also, more complex computational graphs that decouple Sense-Plan-Act autonomous mobile robot
applications can be implemented using ROS 2 topic registration.
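The decoupling that ROS 2 topics provide can be illustrated with a small sketch. The following Python toy is not part of EI for AMR and is not the actual DDS implementation; the TopicBus class and the topic name are hypothetical. It only shows the core idea: publishers and subscribers share a topic name, never a direct reference to each other.

```python
from collections import defaultdict


class TopicBus:
    """Minimal in-process message bus mimicking ROS 2 topic semantics."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # The publisher does not know who, if anyone, consumes the data.
        for callback in self._subscribers[topic]:
            callback(message)


bus = TopicBus()
received = []
# A "planner" node consumes laser scans without knowing the sensor driver...
bus.subscribe("/scan", received.append)
# ...and the sensor driver publishes without knowing its consumers.
bus.publish("/scan", {"ranges": [1.2, 0.8, 2.5]})
```

Because neither side holds a reference to the other, a sensor driver can be swapped out without touching its consumers, which is exactly what the topic abstraction buys in the EI for AMR stack.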
This diagram shows the software components included in the EI for AMR package. The software stack keeps
evolving iteratively with additional algorithms, applications, and third-party ecosystem software components.
The EI for AMR software stack builds on software supported by and part of the underlying hardware
platform: its Unified Extensible Firmware Interface (UEFI) based boot and its supported Linux* operating
system. For requirement details, see:
• Get Started Guide for Robots
• Get Started Guide for Robot Orchestration
• ROS 2, Robot Operating System (ROS), which is a set of open source software libraries and tools for
building robot applications
ROS 2 depends on other middleware, like the Object Management Group (OMG) DDS connectivity
framework, which uses a publish-subscribe pattern. The standard ROS 2 distribution includes the eProsima
Fast DDS implementation.
• ROS 2 Battery Bridge, which utilizes the Battery Bridge Kernel Module to forward battery information from
an EI for AMR’s microcontroller into the Linux kernel
• RPLIDAR ROS 2 Wrapper node, for using RPLIDAR LIDAR sensors with ROS 2
• SICK Safetyscanners ROS 2 Driver, which reads the raw data from the SICK Safety Scanners and
publishes the data as a laser_scan msg
• Teleop Twist Joy, a generic facility for teleoperating twist-based ROS 2 robots with a standard joystick. It
converts joy messages to velocity commands. This node provides no rate limiting or autorepeat
functionality. It is expected that you take advantage of the features built into ROS 2 Driver for Generic
Joysticks for this.
• Teleop Twist Keyboard, generic keyboard teleoperation for ROS 2
• Twist Multiplexer for when there is more than one source to move a robot with a geometry_msgs::Twist
message. It is important to multiplex all input sources into a single source that goes to the EI for AMR
control node.
• ROS 2 Driver for Generic Joysticks
• ModemManager, which provides a unified high level API for communicating with mobile broadband
modems, regardless of the protocol used to communicate with the actual device (for example, generic AT,
vendor-specific AT, QCDM, QMI, MBIM). Edge Insights for Autonomous Mobile Robots uses ModemManager
to establish 5G connections using the Fibocom* FM350-G 5G/LTE modem.
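The Twist Multiplexer idea described above can be sketched in a few lines. This toy Python example is hypothetical and not how the actual ROS 2 package is implemented (the real multiplexer is configuration-driven and subscribes to topics); it only shows the priority selection: when several sources produce velocity commands, only the highest-priority active one reaches the control node.

```python
def multiplex(sources):
    """Return the command from the highest-priority active source.

    sources: list of (priority, twist-or-None); a lower number means a
    higher priority, and None means that source is currently silent.
    """
    active = [(prio, twist) for prio, twist in sources if twist is not None]
    if not active:
        # No source is commanding the robot: emit a safe stop.
        return {"linear_x": 0.0, "angular_z": 0.0}
    return min(active, key=lambda item: item[0])[1]


teleop = {"linear_x": 0.0, "angular_z": 0.5}  # joystick command
nav = {"linear_x": 0.3, "angular_z": 0.0}     # Navigation 2 command
# Teleop (priority 0) overrides autonomous navigation (priority 1).
cmd = multiplex([(0, teleop), (1, nav)])
```

The design point is that the control node only ever sees one geometry_msgs::Twist stream, so manual teleoperation can always preempt autonomous motion.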
• ROS 2 Cartographer, a system that provides real-time simultaneous localization and mapping (SLAM)
based on real-time 2D LIDAR sensor data. It is used to generate as-built floor plans in the form of
occupancy grids.
• ROS 2 Depth Image to Laser Scan, which converts a depth image to a laser scan for use with navigation
and localization.
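The conversion performed by ROS 2 Depth Image to Laser Scan can be sketched with a simplified pinhole model. This toy Python example is hypothetical (the real node uses the full camera intrinsics; here only a horizontal field of view is assumed) and turns one row of a depth image into laser-scan style (angle, range) pairs:

```python
import math


def depth_row_to_scan(depths, fov_rad):
    """Convert one row of a depth image into (angle, range) pairs."""
    n = len(depths)
    scan = []
    for i, depth in enumerate(depths):
        # Angle of pixel i across the horizontal field of view, centered at 0.
        angle = -fov_rad / 2 + fov_rad * i / (n - 1)
        # The depth is measured along the optical axis; project it onto the
        # ray through this pixel to get the range a 2D laser would report.
        scan.append((angle, depth / math.cos(angle)))
    return scan


# A flat wall 2 m in front of the camera, seen with a 60 degree FOV.
scan = depth_row_to_scan([2.0, 2.0, 2.0], math.radians(60))
```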
• ROS 2 Navigation stack, which seeks a safe way to have a mobile robot move from point A to point B.
It completes dynamic path planning, computes velocities for motors, detects and avoids obstacles, and
structures recovery behaviors. Navigation 2 uses behavior trees to call modular servers to complete an
action. An action can be computing a path, controlling effort, recovery, or any other navigation-related
action. These are separate nodes that communicate with the behavior tree over a ROS 2 action server.
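The behavior-tree pattern used by Navigation 2 can be sketched as follows. This toy Python example is hypothetical: real Navigation 2 behavior trees are defined in XML and call ROS 2 action servers, while here the "servers" are plain functions. It shows a sequence node that ticks a compute-path step and then a follow-path step, aborting on the first failure:

```python
def make_sequence(children):
    """Build a behavior-tree 'sequence': tick children in order."""
    def tick(blackboard):
        for child in children:
            if not child(blackboard):
                return False  # a failed child aborts the whole sequence
        return True
    return tick


def compute_path(bb):
    bb["path"] = ["A", "B"]  # stand-in for a call to the planner server
    return True


def follow_path(bb):
    return "path" in bb      # stand-in for a call to the controller server


navigate = make_sequence([compute_path, follow_path])
bb = {}
ok = navigate(bb)
```

The shared blackboard dictionary plays the role that the behavior tree's blackboard plays in Navigation 2: it is how one action's output (the path) reaches the next action.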
• RTAB-Map (Real-Time Appearance-Based Mapping), an RGB-D, stereo, and LIDAR graph-based SLAM
approach based on an incremental appearance-based loop closure detector. The loop closure detector uses
a bag-of-words approach to determine how likely it is that a new image comes from a previous location or
a new location. When a loop closure hypothesis is accepted, a new constraint is added to the map's graph,
and then a graph optimizer minimizes the errors in the map. A memory management approach is used to
limit the number of locations used for loop closure detection and graph optimization, so that real-time
constraints on large-scale environments are always respected. RTAB-Map can be used alone with a
handheld Kinect, a stereo camera, or a 3D LIDAR for 6DoF mapping, or on a robot equipped with a laser
rangefinder for 3DoF mapping.
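The bag-of-words comparison behind loop closure detection can be sketched numerically. In this toy Python example (hypothetical; RTAB-Map's detector is far more elaborate and uses learned visual vocabularies), each image is reduced to a list of visual-word IDs, and a new frame is scored against stored frames by the cosine similarity of their word histograms:

```python
from collections import Counter


def bow_similarity(words_a, words_b):
    """Cosine similarity between two visual-word histograms."""
    a, b = Counter(words_a), Counter(words_b)
    dot = sum(a[w] * b[w] for w in a)
    norm = (sum(v * v for v in a.values()) ** 0.5) * \
           (sum(v * v for v in b.values()) ** 0.5)
    return dot / norm if norm else 0.0


# Stored frames, each summarized as the visual words detected in it.
previous_frames = {"kitchen": [3, 3, 7, 9], "hallway": [1, 2, 2, 5]}
new_frame = [3, 7, 9, 9]
# Loop-closure hypothesis: the stored frame most similar to the new one.
best = max(previous_frames,
           key=lambda k: bow_similarity(previous_frames[k], new_frame))
```

If the best score clears an acceptance threshold, a loop-closure constraint between the two poses would be added to the graph, which is the step that lets the optimizer pull accumulated drift back out of the map.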
• IMU Tools - filters and visualizers - from https://github.com/CCNYRoboticsLab/imu_tools:
• imu_filter_madgwick: A filter which fuses angular velocities, accelerations, and (optionally) magnetic
readings from a generic IMU device into an orientation.
• imu_complementary_filter: A filter which fuses angular velocities, accelerations and (optionally)
magnetic readings from a generic IMU device into an orientation quaternion using a novel approach
based on a complementary fusion.
• rviz_imu_plugin: A plugin for rviz which displays sensor_msgs::Imu messages.
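The complementary fusion idea behind imu_complementary_filter can be sketched in one dimension. This toy Python example is hypothetical (the real filter fuses full 3D rates and fields into an orientation quaternion); it only shows the essence of the approach, blending a short-term gyro integral with a long-term accelerometer tilt estimate:

```python
import math


def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyro rate and an accelerometer tilt estimate into one angle.

    The gyro integral is trusted short-term (alpha close to 1) because it is
    smooth but drifts; the accelerometer is trusted long-term because it is
    noisy but drift-free. That split is the complementary part.
    """
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle


# With a stationary sensor, the estimate converges to the accelerometer angle.
angle = 0.0
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=math.radians(10), dt=0.01)
```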
• Intelligent Sampling and Two-Way Search (ITS) global path planner ROS 2 plugin, a plugin for the
ROS 2 Navigation package that conducts a path-planning search on a roadmap from two directions
simultaneously. The main inputs are the global 2D costmap (nav2_costmap_2d::Costmap2D) and the
start and goal poses (geometry_msgs::msg::PoseStamped); the output is a list of 2D waypoints that
constructs the global path. All inputs and outputs are in standard ROS 2 formats. The ITS planner
converts the costmap into either a Probabilistic Road Map (PRM) or a Deterministic Road Map (DRM);
the generated roadmap is saved in a txt file that can be reused for multiple inquiries. Once a roadmap
is generated, ITS conducts a two-way search to find a path from the source to the destination. Either a
smoothing filter or Catmull-Rom spline interpolation can be used to create a smooth, continuous path;
the generated smooth path is in the form of a ROS navigation message type (nav_msgs::msg).
Currently, the ITS plugin does not support continuous replanning; to use this plugin, a simple behavior
tree with compute path to pose and follow path should be used.
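The two-way search at the heart of ITS can be sketched on a toy roadmap. The following Python example is hypothetical (the real planner works on costmap-derived roadmaps with intelligent sampling); it expands a frontier from the start and from the goal simultaneously and joins the two halves where the frontiers meet:

```python
from collections import deque


def two_way_search(roadmap, start, goal):
    """Return a node list from start to goal, or None if disconnected."""
    if start == goal:
        return [start]
    parents = {start: {start: None}, goal: {goal: None}}
    frontiers = {start: deque([start]), goal: deque([goal])}
    while frontiers[start] and frontiers[goal]:
        # Expand one node on each side per round.
        for side, other in ((start, goal), (goal, start)):
            node = frontiers[side].popleft()
            for nbr in roadmap.get(node, ()):
                if nbr in parents[side]:
                    continue
                parents[side][nbr] = node
                if nbr in parents[other]:  # the two searches met
                    return _join(parents, start, goal, nbr)
                frontiers[side].append(nbr)
    return None


def _join(parents, start, goal, meet):
    """Stitch the start-side and goal-side parent chains at the meet node."""
    half, n = [], meet
    while n is not None:
        half.append(n)
        n = parents[start][n]
    path = half[::-1]
    n = parents[goal][meet]
    while n is not None:
        path.append(n)
        n = parents[goal][n]
    return path


roadmap = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
path = two_way_search(roadmap, "A", "D")
```

Searching from both ends keeps each frontier roughly half as deep as a one-way search, which is the main reason bidirectional search finds roadmap paths faster.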
• Kudan Visual SLAM (KdVisual), Kudan's proprietary visual SLAM software, which has been extensively
developed and tested for use in commercial settings. Open source and other commercial algorithms
struggle in many common use cases and scenarios. Kudan Visual SLAM achieves much faster processing
time, higher accuracy, and more robust results in dynamic situations.
• The Point Cloud Library (PCL), a standalone, large scale, open project for 2D/3D image and point cloud
processing (see also https://pointclouds.org/). The EI for AMR SDK version of PCL adds optimized
implementations of several PCL modules which allow you to offload computation to a GPU.
• Robot_localization (from https://github.com/cra-ros-pkg/robot_localization), a collection of state
estimation nodes, each of which is an implementation of a nonlinear state estimator for robots moving in
3D space. It contains two state estimation nodes, ekf_localization_node and ukf_localization_node. In
addition, robot_localization provides navsat_transform_node, which aids in the integration of GPS data.
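The state estimation performed by robot_localization's nodes can be sketched in one dimension. This toy Python example is hypothetical (ekf_localization_node estimates a full 3D state from many sensors); it shows the predict-update cycle that lets drifting odometry be corrected by an absolute position measurement, which is the role GPS plays via navsat_transform_node:

```python
def kf_step(x, p, u, z, q=0.1, r=0.5):
    """One predict-update cycle of a 1-D Kalman filter.

    x, p: state estimate and its variance; u: odometry motion since the
    last step; z: absolute position measurement; q, r: noise variances.
    """
    x, p = x + u, p + q  # predict: integrate odometry, grow the uncertainty
    k = p / (p + r)      # Kalman gain: how much to trust the measurement
    return x + k * (z - x), (1 - k) * p  # update: blend the measurement in


truth, x, p = 0.0, 0.0, 1.0
for _ in range(50):
    truth += 1.0                           # the robot actually moves 1 m per step
    x, p = kf_step(x, p, u=1.02, z=truth)  # odometry over-reports by 2 percent
```

Without the update step the 2 percent odometry bias would accumulate without bound; with it, the estimate stays pinned near the true position.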
• SLAM Toolbox, a set of tools and capabilities for 2D SLAM built by Steve Macenski that includes the
following.
• Starting, mapping, saving pgm files, and saving maps for 2D SLAM mobile robotics
• Refining, remapping, or continuing to map a saved (serialized) pose-graph at any time
• Loading a saved pose-graph to continue mapping in a space while also removing extraneous information
from newly added scans (life-long mapping)
• An optimization-based localization mode built on the pose-graph. Optionally run localization mode
without a prior map for “LIDAR odometry” mode with local loop closures
• Synchronous and asynchronous modes of mapping
• Kinematic map merging (with an elastic graph manipulation merging technique in the works)
• Plugin-based optimization solvers with an optimized Google* Ceres-based plugin
• rviz2 plugin for interacting with the tools
• Graph manipulation tools in rviz2 to manipulate nodes and connections during mapping
• Map serialization and lossless data storage
• See also https://github.com/SteveMacenski/slam_toolbox.
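The pose-graph optimization that SLAM Toolbox delegates to its Ceres-based solver plugin can be sketched in one dimension. This toy Python example is hypothetical (real solvers minimize the constraint errors far more efficiently and in full SE(2)/SE(3)); it relaxes a chain of odometry constraints against a disagreeing loop-closure constraint:

```python
def optimize(poses, constraints, iters=200, step=0.1):
    """Iteratively relax 1-D poses to better satisfy relative constraints.

    constraints: (i, j, rel) triples meaning poses[j] - poses[i] should be rel.
    """
    poses = list(poses)
    for _ in range(iters):
        for i, j, rel in constraints:
            error = (poses[j] - poses[i]) - rel
            poses[i] += step * error / 2  # nudge both endpoints toward
            poses[j] -= step * error / 2  # satisfying this constraint
        poses[0] = 0.0  # anchor the first pose at the origin
    return poses


# Odometry claims each hop is 1.0 m, but a loop closure measures only 2.7 m
# from pose 0 to pose 3; the optimizer spreads the 0.3 m disagreement out.
poses = optimize([0.0, 1.0, 2.0, 3.0],
                 [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (0, 3, 2.7)])
```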
Edge Server Applications
• OpenVINO™ Model Server (OVMS), a high-performance system for serving machine learning models. It is
based on C++ for high scalability and optimized for Intel solutions, so that you can take advantage of the
power of the Intel® Xeon® processor or Intel’s AI accelerator and expose it over a network interface. OVMS
uses the same architecture and API as TensorFlow Serving, while applying OpenVINO for inference
execution. Inference service is provided via gRPC or REST API, making it easy to deploy new algorithms
and AI experiments.
• ThingsBoard*, an open-source IoT platform for data collection, processing, visualization, and device
management. It enables device connectivity via industry standard IoT protocols - MQTT, CoAP and HTTP
and supports both cloud and on-premises deployments.
Tools
ROS Tools
Edge Insights for Autonomous Mobile Robots is validated using ROS 2 nodes. ROS 1 is not compatible with EI
for AMR components. A ROS 1 bridge is included to allow EI for AMR components to interface with ROS 1
components.
• From the hardware perspective of the supported platforms, there are no known limitations for ROS 1
components.
• For information on porting ROS 1 applications to ROS 2, here is a guide from the ROS community.
Edge Insights for Autonomous Mobile Robots includes:
• colcon (collective construction), a command line tool to improve the workflow of building, testing, and
using multiple software packages. It automates the process, handles the ordering, and sets up the
environment to use the packages.
• rqt, a software framework of ROS 2 that implements the various GUI tools in the form of plugins.
• rviz2, a tool used to visualize ROS 2 topics.
Simulation
Edge Insights for Autonomous Mobile Robots includes:
• The Gazebo* robot simulator, making it possible to rapidly test algorithms, design robots, perform
regression testing, and train AI systems using realistic scenarios. Gazebo offers the ability to simulate
populations of robots accurately and efficiently in complex indoor and outdoor environments.
• An industrial simulation room model for Gazebo*, the Open Source Robotics Foundation (OSRF) Gazebo
Environment for Agile Robotics (GEAR) workcell that was used for the ARIAC competition in 2018.
Other Tools
Edge Insights for Autonomous Mobile Robots includes:
• Intel® oneAPI Base Toolkit (formerly known as Intel® System Studio), which includes the DPC++ compiler
and compatibility tool, as well as debugging and profiling tools such as VTune™ Profiler.
• OpenVINO™ Tools, including the model optimization tool.
Deployment
All applications, algorithms, and middleware components which are executed as standalone processes are
deployed in their own Docker* containers. This allows you to selectively pull these components onto an EI for
AMR or Edge Server and launch them there.
For development purposes, the middleware libraries and all tools are deployed in a single container called
Full SDK. This container is constructed hierarchically by extending the OpenVINO SDK container, which itself
extends the ROS2 SDK container. For storage space savings, you can choose to run any of the containers
depending on the needs of your application.
• The ROS2 SDK container includes the ROS 2 middleware and tools, Intel® RealSense™ SDK and ROS 2
wrapper, GStreamer* and build tools, ROS 2 packages (Cartographer, Navigation, RTAB_MAP) and the
FastMapping application (the Intel-optimized version of octomap).
• The OpenVINO SDK container includes the ROS2 SDK, as well as the OpenVINO™ development toolkit, the
OpenVINO™ DL GStreamer* plugins and the Wandering demonstration application.
• The Full SDK container includes the OpenVINO™ container, as well as the Intel® oneAPI Base Toolkit, the
Data Parallel C++ (DPC++) compatibility tool, and profiler and analyzer tools.
Recommended Hardware
Knowledge/Experience
• You are familiar with executing Linux* commands.
• You have basic Docker* experience.
• ROS 1 or ROS 2 background recommended.
Target System for Development and Simulations with the Robot Complete Kit
• Intel® processors:
• 11th Generation Intel® Core™ processors with Intel® Iris® Xe Integrated Graphics or Intel® UHD Graphics
• 10th Generation Intel® Core™ processors with an integrated GPU and Intel® UHD Graphics
• 16 GB RAM
• 128 GB hard drive
• Intel® RealSense™ camera D435i
• Accelerator: Intel® Movidius™ Myriad™ X VPU (optional)
• IoT Ubuntu* Desktop 20.04
• Slamtec* RPLIDAR A3 2D LIDAR (optional)
Target System for the Server Complete Kit and Robot and Server Complete Kit
• Intel® processors:
• Intel® Xeon® processor E3, E5, and E7 family
• 2nd Generation Intel® Xeon® Scalable Processors
• 3rd Generation Intel® Xeon® Scalable Processors
NOTE Intel recommends the previously listed Intel® Xeon® processors for any system running resource-
intensive loads, such as remote inference and collaborative visual SLAM. If the server is only for fleet
management, robot deployment, and robot onboarding, the following Intel® Core™ processors are
sufficient.
• 11th Generation Intel® Core™ processors with Intel® Iris® Xe Integrated Graphics or Intel® UHD Graphics
• 10th Generation Intel® Core™ processors with an integrated GPU and Intel® UHD Graphics
• 16 GB RAM
• 128 GB hard drive
• Ubuntu* 20.04 LTS
• Command:
./edgesoftware --help
• Response:
Options:
-v, --version Show the version number and exit.
--help Show this message and exit.
Commands:
download Download modules of a package.
export Exports the modules installed as a part of a package.
install Install modules of a package.
list List the modules of a package.
log Show log of CLI events.
pull Pull Docker image.
• Command:
./edgesoftware --version
• Response: The edgesoftware version, build date, and target OS.
List the Package Modules
• Command:
./edgesoftware list
• Response: The modules installed and status.
• Command:
./edgesoftware download
• Response: All available modules in that package are downloaded.
• Command:
./edgesoftware log
• Response: CLI event log information, such as:
• target system information (hardware and software)
• system health
• installation status
• modules you can install
./edgesoftware list
./edgesoftware install
NOTE On a fresh Linux* installation, you might need to use the install command at least once
before performing an update. install makes sure all dependencies and packages are installed on the
target system.
./edgesoftware install
During the installation, you are prompted to enter your product key. The product key is in the email message
you received from Intel confirming your Edge Insights for Autonomous Mobile Robots download.
./edgesoftware list -d
./edgesoftware export
./edgesoftware uninstall -a
NOTE This command does not uninstall Docker* Compose and Docker* Community Edition (CE).
Troubleshooting
If the following error is encountered:
Hardware Prerequisite
You have one of these AAEON* robot kits:
• UPS 6000 Robotic Kit
• UP Xtreme i11 Robotic Kit
This tutorial uses the UP Xtreme i11 Robotic Kit.
If you need help assembling your robot, see AAEON* Resources.
You can use one of these teleop methods to validate that the robot kit hardware setup was done correctly.
• Robot Teleop Using a Gamepad
Start at step 2 (insert the USB dongle in the robot); and, for step 5, run the yml file exactly as shown in
the example (ignore the instruction to replace it with your own generic yml file).
• Robot Teleop Using a Keyboard
For step 2, instead of customizing your file, use the exact command in the example.
NOTE The full-sdk docker image is only present in the Robot Complete Kit, not in the Robot Base Kit
or UP Xtreme i11 Robotic Kit.
amr-aaeon-amr-interface
amr-ros-base
amr-imu-tools
amr-robot-localization
amr-realsense
amr-collab-slam
amr-collab-slam-gpu
amr-nav2
wandering
NOTE If these images are not installed, continuing with these steps triggers a build that takes
longer than an hour (sometimes much longer, depending on system resources and internet
connection).
2. If these images are not installed, Intel recommends checking your installation with FastMapping
Algorithm or installing the Robot Complete Kit with the Get Started Guide for Robots.
mkdir ~/imu_cal
sudo chmod 0777 ~/imu_cal
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
2. Start the ros2_amr_interface:
"linear_accel_z: /=-nan"
To fix this, restart the aaeon node:
• Ang_vel_x in gyro: x
• Ang_vel_y in gyro: y
• Ang_vel_z in gyro: z
• linear_accel_x in accelerometer: x
• linear_accel_y in accelerometer: y
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
gedit 01_docker_sdk_env/artifacts/01_amr/amr_generic/param/aaeon_node_params.yaml
# Replace the values in this file with the ones you got in the previous step.
imu:
frame_id: imu_link
offsets:
accelerometer:
x: 0.399657
y: -0.460118
z: 0.0
gyro:
x: -0.00301162
y: -0.00446097
z: 0.00168006
NOTE Indentation is important in yaml files, so make sure to align offsets with frame_id. If the
indentation is incorrect, the container reports an error when started.
5. Verify that the changes are correctly aligned and that the aaeon-amr-interface node can start:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2_ukf.tutorial.yml up aaeon-amr-interface
Expected results:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2_ukf.tutorial.yml down --remove-orphans
Map an Area with the Wandering Application and UP Xtreme i11 Robotic Kit
The goal of the wandering application is to map an area and avoid hitting objects.
1. Place the robot in an area with multiple objects in it.
2. Go to the installation folder of Edge_Insights_for_Autonomous_Mobile_Robots:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
# The following command makes sure you have correct permissions so that collaborative SLAM can save the map
sudo find . -type d -exec chmod 775 {} +
sudo chown $USER:$USER * -R
3. Start mapping the area:
NOTE If available, use a different development machine because rviz2 consumes a lot of resources
that may interfere with the robot.
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/rviz_robot_wandering.yml up
NOTE Displaying in 3D consumes a lot of the system resources. Intel recommends opening rviz2 on a
development system. The development system needs to be in the same network and have the same
ROS_DOMAIN_ID set.
6. To stop the robot from mapping the area, do one of the following:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2_ukf.tutorial.yml down --remove-orphans
If the robot moves in an unpredictable way and hits objects easily, there may be some hardware
configuration issues. See the Troubleshooting section for suggestions.
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_localization_realsense_collab_slam_nav2_ukf.tutorial.yml up
Expected result: The robot starts moving in the already mapped area and reports if it is able to localize
itself or not.
• If the robot is able to localize itself, the amr-collab-slam node increases the tracking success
number, and the “relocal fail number” stays constant:
• If the robot is not able to localize itself, the amr-collab-slam node keeps the tracking success
number constant and increases the “relocal fail number”:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/rviz_robot_localization.yml up
NOTE If the robot is not able to localize itself, the robot does not start navigating the room, and rviz2
reports map not found. To avoid this, move the robot to the room where the map was created and face
it towards a keypoint. It also helps if the room you mapped has a lot of keypoints.
Perform Object Detection While Mapping an Area with the UP Xtreme i11 Robotic Kit
1. Place the robot in an area with multiple objects in it.
2. Go to the installation folder of Edge_Insights_for_Autonomous_Mobile_Robots:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
3. Start mapping the area and listing the detected objects:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering_local_inference.yml up |grep Label
Expected result: The robot starts wandering around the room and listing what objects it sees:
AAEON* Resources
• Development Kit: https://github.com/up-board/up-community/wiki/UP-Robotic-Development-Kit-QSG
• Hardware Assembly: https://github.com/up-board/up-community/wiki/UP-Robotic-Development-Kit-HW-Assembly-Guide
• Power Management: https://github.com/up-board/up-community/wiki/UP-Robotic-Development-Kit-Power-Management-Guide
Troubleshooting
If the server fails to load the map:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
sudo find . -type d -exec chmod 775 {} +
sudo chown $USER:$USER * -R
If the tracker (univloc_tracker_ros) fails to start, giving the following error, see Collaborative Visual SLAM
on Intel® Atom® Processor-Based Systems.
amr-collab-slam | [ERROR] [univloc_tracker_ros-2]: process has died [pid 140, exit code -4, cmd
'/home/eiforamr/workspace/CollabSLAM/prebuilt_collab_slam_core/univloc_tracker/lib/
univloc_tracker/univloc_tracker_ros --ros-args -r __node:=univloc_tracker_0 -r __ns:=/ --params-
file /tmp/launch_params_zfr70odz -r /tf:=tf -r /tf_static:=tf_static -r /univloc_tracker_0/
map:=map'].
If the robot does not start moving, the firmware might be stuck. To make it work again:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
./run_interactive_docker.sh amr-aaeon-amr-interface:2022.3 eiforamr -c aaeon_robot
ros2 run ros2_amr_interface amr_interface_node --ros-args -p try_reconnect:=true -p publishTF:=true --remap /amr/cmd_vel:=/cmd_vel -p port_name:=/dev/ttyUSB0
ctrl-c
ros2 run ros2_amr_interface amr_interface_node --ros-args -p try_reconnect:=true -p publishTF:=true --remap /amr/cmd_vel:=/cmd_vel -p port_name:=/dev/ttyUSB0
# Look for the text: [INFO] [1655311131.144572706] [AMR_node]: Hardware is now online
# If you don't get this, repeat the commands from the docker image and check whether the motor controller is attached to /dev/ttyUSB0.
# If it is not attached to /dev/ttyUSB0, find out which port it is and adapt the commands accordingly.
# When you get the [INFO] [1655311131.144572706] [AMR_node]: Hardware is now online message, exit the docker image:
exit
If the robot is not behaving as instructed when using the teleop_twist_keyboard, try the following steps.
1. Check the direction of the wheels. The way they are facing is very important, as shown in the following
picture.
Each wheel says R (Right) or L (Left). Intel had to use the following wheel setup:
R (wheel) <<<>>> L (wheel)
L (wheel) <<<>>> R (wheel)
2. Check the connection between the wheels (left in the following picture) and the motor controller.
It is very important that the hardware setup is configured correctly. If it is not, the problem becomes evident when testing with the teleop_twist_keyboard.
3. If the wheels do not turn at all, there may be something wrong with the wheel motor control. The
board’s datasheet states that it takes a 12 V input. Intel found that a 12.5 V input did not work, but
5 V, 8 V, and 10 V inputs do work.
If the IMU gives errors and you did not install the librealsense udev rules when you configured the host,
install the librealsense udev rules now:
Hardware Requirements
The robot base should contain:
• Intel® compute system with Edge Insights for Autonomous Mobile Robots installed
• Intel® RealSense™ camera
• Robot base support (chassis) for the Intel® compute system and the Intel® RealSense™ camera
• Wheels
• Motor
• Motor controller
• Batteries for all components
Software Requirements
The robot base should have a ROS 2 node capable of:
• Publishing information from the motor controller firmware into ROS 2 topics, for example, wheel odometry
• Receiving data from other ROS 2 nodes and transmitting it to the motor controller firmware, for
example, movement commands from the ROS 2 Navigation 2 stack on the cmd_vel topic
• Providing robot-specific information such as the tf tree with correct transformations, for example, the
odom and base_link frames and the transform between them
• When using multiple robots, changing these frame names per robot, for example, robot1_odom and
robot1_base_link
NOTE This ROS 2 node runs on the compute system and gets information from the motor robot
controller via a wire connection, usually a USB connection.
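As background for the wheel odometry the robot base node publishes, differential-drive odometry can be integrated from the two wheel velocities. This is a minimal illustrative sketch, not SDK code; the function and parameter names are assumptions:

```python
import math

def integrate_odometry(x, y, theta, v_left, v_right, wheel_separation, dt):
    """Integrate differential-drive odometry for one time step.

    v_left / v_right are wheel linear velocities in m/s,
    wheel_separation is the distance between the wheels in meters.
    Returns the updated (x, y, theta) pose in the odom frame.
    """
    v = (v_right + v_left) / 2.0               # forward velocity of the base
    w = (v_right - v_left) / wheel_separation  # angular velocity in rad/s
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return x, y, theta

# Driving straight: both wheels at 0.5 m/s for 1 s moves the robot 0.5 m forward.
pose = integrate_odometry(0.0, 0.0, 0.0, 0.5, 0.5, 0.3, 1.0)
```

A real base node would run this at the encoder sampling rate and publish the result as a nav_msgs/Odometry message on the odometry topic.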
Connect:
• The Intel® RealSense™ camera via USB to the Intel® compute system
• The motor controller to the Intel® compute system
The best way to integrate with Edge Insights for Autonomous Mobile Robots is to create a Docker* image for
the robot base node.
The robot base ROS 2 node can also be started outside of a Docker* container, but Intel recommends
creating a Docker* image and adding it to Edge Insights for Autonomous Mobile Robots so that the complete
pipeline of all components needed for autonomous navigation is in one yaml file.
If started outside of the SDK, use the same ROS_DOMAIN_ID for both the robot base node and the rest of
the pipeline.
For how to create a Docker* image for the robot base node, see Create New Docker* Images with
Selected Applications from the SDK.
• odom is used by the Navigation 2 package and others to get information from sensors, especially the
wheel encoders. See this Navigation 2 tutorial on odometry for more information.
• base_link represents the center of the robot to which all other links are connected.
The robot base node:
• Creates the transform between odom and base_link
• Is subscribed to cmd_vel, which is used by the Navigation 2 package to give the robot instructions like
spin in place or move forward
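For a planar robot, the odom to base_link transform reduces to a 2D pose (x, y, yaw), and the quaternion in the tf message can be computed from the yaw angle alone. A minimal illustrative sketch (not SDK code):

```python
import math

def yaw_to_quaternion(yaw):
    """Convert a planar yaw angle (radians) to a tf-style quaternion (x, y, z, w).

    For a rotation purely about the Z axis, only the z and w components are nonzero.
    """
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# Zero yaw gives the identity quaternion.
q = yaw_to_quaternion(0.0)
# A 90-degree turn in place.
q90 = yaw_to_quaternion(math.pi / 2.0)
```

In a real node, the resulting quaternion would fill the rotation field of the geometry_msgs/TransformStamped broadcast between odom and base_link.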
In Edge Insights for Autonomous Mobile Robots, there are two examples:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
ls 01_docker_sdk_env/artifacts/01_amr/amr_generic/param/pengo_nav.param.yaml
ls 01_docker_sdk_env/artifacts/01_amr/amr_generic/param/aaeon_node_params.yaml
# These are the configuration files used by the robot base nodes of the Pengo and UP Xtreme i11
# robotic kits.
ls 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml
ls 01_docker_sdk_env/docker_compose/05_tutorials/pengo_wandering__kobuki_realsense_collab_slam_fm_nav2.tutorial.yml
# These are the yaml files that start the full pipeline that makes these robots wander an area
# and map it.
# In them, you can see how each node is started.
One is for AAEON’s UP Xtreme i11 Robotic Kit, and the other is for Cogniteam’s Pengo robot.
NOTE The following commands work only if they are run on Cogniteam’s Pengo robot or AAEON’s
UP Xtreme i11 Robotic Kit.
Use Cogniteam’s Pengo robot and AAEON’s UP Xtreme i11 Robotic Kit as references, starting their
nodes like this:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
# for Cogniteam's Pengo robot
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/pengo_wandering__kobuki_realsense_collab_slam_fm_nav2.tutorial.yml up kobuki
# or for UP Xtreme i11 Robotic Kit
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml up aaeon-amr-interface
In a different terminal, attach to the opened Docker* image:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
ls 01_docker_sdk_env/artifacts/01_amr/amr_generic/param/aaeon_nav.param.yaml 01_docker_sdk_env/artifacts/01_amr/amr_generic/param/pengo_nav.param.yaml
One is for AAEON’s UP Xtreme i11 Robotic Kit, and the other is for Cogniteam’s Pengo robot.
You can also compare the two nav.param.yaml files that are in Edge Insights for Autonomous Mobile Robots
to understand which parameters are different from robot to robot:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
meld 01_docker_sdk_env/artifacts/01_amr/amr_generic/param/aaeon_nav.param.yaml 01_docker_sdk_env/artifacts/01_amr/amr_generic/param/pengo_nav.param.yaml
The most important parameters to set are:
• use_sim_time: whether to use simulation time
Ignore any differences in this parameter between the files. It is used in simulations. Set it to False
when running in a real environment.
• base_frame_id: robot frame ID being published by the robot base node
The default is base_footprint, but base_link is another option. Use the one you choose to publish in
the robot base node.
• robot_model_type: robot type
The options are omnidirectional, differential, or a custom motion model that you provide.
• tf_broadcast: turns transform broadcast on or off
Set this to False to prevent amcl from publishing the transform between the global frame and the
odometry frame.
• odom_topic: source of instantaneous speed measurement
• max_vel_x, max_vel_theta: maximum speed on the X axis or angular (theta)
• robot_radius: radius of the robot
• inflation_radius: radius to inflate costmap around lethal obstacles
• min_obstacle_height: minimum height of a sensor return to be added to the occupancy grid
• max_obstacle_height: maximum height of a sensor return to be added to the occupancy grid
• obstacle_range: determines the maximum range sensor reading that results in an obstacle being put
into the costmap
• max_rotational_vel, min_rotational_vel, rotational_acc_lim: configure the rotational velocity
allowed for the base in radians/second
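The effect of min_obstacle_height, max_obstacle_height, and obstacle_range can be illustrated with a simple filter over sensor returns. This is a sketch of the parameter semantics only, not the Navigation 2 implementation, and the function name is an assumption:

```python
import math

def keep_return(point, min_obstacle_height, max_obstacle_height, obstacle_range):
    """Decide whether a sensor return (x, y, z in the robot frame) is marked
    as an obstacle in the costmap, mirroring the parameter semantics above."""
    x, y, z = point
    if not (min_obstacle_height <= z <= max_obstacle_height):
        return False  # too low (e.g. the floor) or too high to be an obstacle
    # Readings beyond obstacle_range are not put into the costmap.
    return math.hypot(x, y) <= obstacle_range

# A box 1 m ahead at 0.2 m height is kept; the floor (z=0) and a far wall are not.
kept = [keep_return(p, 0.05, 2.0, 2.5)
        for p in [(1.0, 0.0, 0.2), (1.0, 0.0, 0.0), (5.0, 0.0, 0.2)]]
```

Tuning these three parameters therefore controls which sensor returns become lethal cells before inflation_radius expands them.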
cp 01_docker_sdk_env/artifacts/01_amr/amr_generic/param/aaeon_nav.param.yaml 01_docker_sdk_env/artifacts/01_amr/amr_generic/param/generic_robot_nav.param.yaml
# Replace generic_robot_nav with a name that makes sense for your robotic kit.
# When replacing it, make sure that it is also replaced wherever it is used.
# It is used in the general pipeline yaml file that starts all components.
gedit 01_docker_sdk_env/artifacts/01_amr/amr_generic/param/generic_robot_nav.param.yaml
# You can also use any other preferred editor; it is important, though, to keep the path.
# Make all changes that are specific to your robotic kit; see the previous chapter where they are
# described in detail.
• ros-base-camera-tf: Uses static_transform_publisher to create transforms between base_link
and camera_link
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
ls 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml
ls 01_docker_sdk_env/docker_compose/05_tutorials/pengo_wandering__kobuki_realsense_collab_slam_fm_nav2.tutorial.yml
One is for AAEON’s UP Xtreme i11 Robotic Kit, and the other is for Cogniteam’s Pengo robot.
cp 01_docker_sdk_env/docker_compose/05_tutorials/aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml 01_docker_sdk_env/docker_compose/05_tutorials/generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml
# Replace generic_robot_wandering with a name that makes sense for your robotic kit.
gedit 01_docker_sdk_env/docker_compose/05_tutorials/generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml
# You can also use any other preferred editor; it is important, though, to keep the path.
Make all of the changes that are specific to your robotic kit:
1. Replace the aaeon-amr-interface target with the generic robot node you created in Step 3: Robot
Base Node ROS 2 Node.
2. Remove the ros-base-teleop target because this is specific to AAEON’s UP Xtreme i11 Robotic Kit.
3. In the ROS 2 command file, change the Navigation 2 target so that params_file targets the
parameter file you created in Step 4: Robot Base Node ROS 2 Navigation Parameter File.
from: params_file:=${CONTAINER_BASE_PATH}/01_docker_sdk_env/artifacts/01_amr/amr_generic/param/aaeon_nav.param.yaml
to: params_file:=${CONTAINER_BASE_PATH}/01_docker_sdk_env/artifacts/01_amr/amr_generic/param/generic_robot_nav.param.yaml
4. In the ros-base-camera-tf target, change the transform values from
static_transform_publisher. The values for x, y, and z depend on where your Intel® RealSense™
camera is mounted.
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
3. Start mapping the area:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml up
Expected result: The robot starts wandering around the room and mapping the entire area.
4. On a different terminal, prepare the environment to visualize the mapping and the robot using rviz2.
NOTE If available, use a different development machine because rviz2 consumes a lot of resources
that may interfere with the robot.
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/rviz_robot_wandering.yml up
NOTE Displaying in 3D consumes a lot of system resources. Intel recommends opening rviz2 on a
development system. The development system needs to be in the same network and have the same
ROS_DOMAIN_ID set.
6. To stop the robot from mapping the area, do one of the following:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml down --remove-orphans
Perception
The following tutorials offer solutions for sensors and computer vision in ROS 2-based Docker* containers.
NOTE If one or both of the images are not installed, continuing with these steps triggers a build
that takes longer than an hour (sometimes, a lot longer depending on the system resources and
internet connection).
3. If the image is not installed, Intel recommends installing the Robot Complete Kit with the Get Started
Guide for Robots.
4. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
5. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=12
6. Run the command below to start the Docker* container:
realsense-viewer
In the Intel® RealSense™ viewer, if a firmware update is available, a popup window appears in
the upper right corner.
b. Press Install in the popup window. Do not disconnect the Intel® RealSense™ camera during the
firmware update installation.
c. After the installation is complete or if no update is available, close the Intel® RealSense™ viewer.
d. Exit the Docker* image:
exit
8. Run an automated yml file that opens the Intel® RealSense™ ROS 2 node and lists camera-relevant
information.
9. To close this, do one of the following:
• Type Ctrl-c in the terminal where you did the up command.
• Run this command in another terminal:
Troubleshooting
In some cases, the stream may not appear due to permission issues on the host. You may see this error
message:
If the problem persists, you can try any or all of the following:
• Verify that $DISPLAY has the correct value.
• Perform an Intel® RealSense™ hardware reset:
This tutorial tells you how to run the segmentation demo application both on a static image and on a video
stream received from an Intel® RealSense™ camera.
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends installing the Robot Base Kit or Robot Complete Kit with
the Get Started Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=16
5. Launch the automated execution of the ROS 2 OpenVINO™ toolkit sample applications:
b. Execution of the object segmentation sample code with input from the Intel® RealSense™ camera topic:
This requires an Intel® RealSense™ camera connected to the testing target. It takes one minute,
and you can see the semantic segmentation being applied to the video stream received from the
camera.
6. To close this, do one of the following:
• Type Ctrl-c in the terminal where you did the up command.
• Run this command in another terminal:
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/ros2_openvino.tutorial.yml down
How it Works
All of the commands required to run this tutorial are documented in:
01_docker_sdk_env/docker_compose/05_tutorials/ros2_openvino.tutorial.yml
To use your own image to run semantic segmentation:
1. Copy your image into the AMR_containers folder at:
cp <path_to_image>/my_image.jpg 01_docker_sdk_env/docker_compose/05_tutorials/param/
2. Edit 01_docker_sdk_env/docker_compose/05_tutorials/ros2_openvino.tutorial.yml, at line
34, adding the following command:
cp ${CONTAINER_BASE_PATH}/01_docker_sdk_env/docker_compose/05_tutorials/param/my_image.jpg ../ros2_ws/src/ros2_openvino_toolkit/data/images/
3. Edit 01_docker_sdk_env/docker_compose/05_tutorials/param/
pipeline_segmentation_image.yaml to change the input_path:, line 4:
input_path: /home/eiforamr/ros2_ws/src/ros2_openvino_toolkit/data/images/my_image.jpg
4. Run the automated yml:
Troubleshooting
For general robot issues, go to: Troubleshooting for Robot Tutorials.
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends installing the Robot Complete Kit with the Get Started
Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=22
5. Run inference engine object detection on a pre-trained network using the Single-Shot Multibox
Detection (SSD) method. Run the detection demo application for a CPU:
Expected output: A video in a loop with cars being detected and labeled by the neural network using a
GPU.
9. To close this, do one of the following:
• Type Ctrl-c in the terminal where you did the up command.
• Run this command in another terminal:
NOTE Only execute this command on systems with an Intel® Movidius™ Myriad™ X accelerator.
Check your system:
lsusb
Look for Intel Movidius MyriadX in the output.
NOTE There is a known issue that if you choose to run the object_detection_demo using the -d
MYRIAD option, a core dump error is thrown when the demo ends.
If errors occur, remove the following file and try again:
rm -rf /tmp/mvnc.mutex
Troubleshooting
If running the yml file gets stuck at downloading:
gedit 01_docker_sdk_env/docker_compose/05_tutorials/openvino_CPU.tutorial.yml
# In the same way open any other yml you want to test behind a proxy.
Add the following lines after echo "*** Set up the OpenVINO environment ***", replacing http://
<http_proxy>:port with your actual environment http_proxy.
export http_proxy="http://<http_proxy>:port"
export https_proxy="http://<https_proxy>:port"
For general robot issues, go to: Troubleshooting for Robot Tutorials.
NOTE If one or both of the images are not installed, continuing with these steps triggers a build
that takes longer than an hour (sometimes, a lot longer depending on the system resources and
internet connection).
4. If one or both of the images are not installed, Intel recommends installing the Robot Base Kit or Robot
Complete Kit with the Get Started Guide for Robots.
5. Run the Sample Application.
a. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
b. Prepare the docker_compose environment:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=114
c. Get the Slamtec* RPLIDAR serial port:
export RPLIDAR_SERIAL_PORT=/dev/ttyUSB0
# this value may differ from system to system, use the value returned in the previous step
f. Run the Slamtec* RPLIDAR tutorial:
NOTE If one or both of the images are not installed, continuing with these steps triggers a build
that takes longer than an hour (sometimes, a lot longer depending on the system resources and
internet connection).
4. If one or both of the images are not installed, Intel recommends installing the Robot Complete Kit with
the Get Started Guide for Robots.
5. Run the Sample Application.
a. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
b. Prepare the docker_compose environment:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=114
# Export the Sick NanoScan3 IP and the host's IP
export HOST_IP=<host_ip>
export SICK_NANOSCAN_IP=<sick_nanoscan_ip>
c. Run the SICK* nanoScan3* laser scanner tutorial:
Troubleshooting
For general robot issues, go to: Troubleshooting for Robot Tutorials.
GStreamer* Pipelines
This tutorial tells you how to set up and run a GStreamer* video pipeline using Sony’s IMX390 MIPI sensor.
Prerequisites: To enable Sony’s IMX390 MIPI sensor, you must use the Resource Design Center (RDC), have
a Corporate Non-Disclosure Agreement (CNDA) in place, and ask for download access.
uname -r
Step 1 is only valid for updating from kernel versions 5.4, 5.8, 5.11 and 5.13. If you have a different kernel,
go to the Support Forum.
This process can take from 30 minutes to two hours, depending on your system.
1. Clone Intel’s Linux* LTS kernel 5.10.131 repository from GitHub*.
sudo apt-get -y install build-essential gcc bc bison flex libssl-dev libncurses5-dev libelf-dev
dwarves zstd
4. Copy the configuration file to your folder, and rename it .config.
scripts/config --module CONFIG_VIDEO_AR0234
scripts/config --module CONFIG_PINCTRL_TIGERLAKE
scripts/config --enable CONFIG_INTEL_IPU6_TGLRVP_PDATA
7. Compile the kernel, and make the Debian* kernel packages.
make olddefconfig
make -j4 deb-pkg
8. Install the new Debian* kernel packages.
cd /tmp
sudo update-grub
10. Reboot your system.
sync
sudo reboot -fn
11. Check your kernel version after reboot.
uname -r
unzip 645460.zip
tar -xf ipu6_rpm_beta.tar.bz2
cp -r rpm /tmp
3. Run the Docker* image as root:
xhost +
./run_interactive_docker.sh amr-gstreamer:<TAG> root -e "--volume /sys/kernel/:/sys/kernel:rw --volume /sys/class:/sys/class:rw"
4. If your network runs behind proxies, export the corresponding proxies in the container.
export http_proxy="http://<http_proxy>:port"
export https_proxy="http://<https_proxy>:port"
5. Install the RPM package in the Docker* container:
apt-get update
apt-get install rpm
6. Set isys_freq:
export DISPLAY=:0 #If you are on VNC adapt this value to the correct one
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig:/usr/lib64/pkgconfig:/usr/lib/pkgconfig
export LD_LIBRARY_PATH=/usr/local/lib:/usr/lib64:/usr/lib
export GST_PLUGIN_PATH=/usr/lib/gstreamer-1.0
export GST_GL_PLATFORM=egl
Troubleshooting
For general robot issues, go to: Troubleshooting for Robot Tutorials.
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends installing the Robot Base Kit or Robot Complete Kit with
the Get Started Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=34
5. Run an automated yml file that opens a GStreamer* sample application inside the EI for AMR Docker*
container.
Expected output:
cp test.ogg ${CONTAINER_BASE_PATH}/01_docker_sdk_env/docker_compose/05_tutorials/test.ogg
And update line 26 to:
Troubleshooting
For general robot issues, go to: Troubleshooting for Robot Tutorials.
Run a GStreamer* video pipeline using GStreamer* plugins, and display a video file in a Docker* container
window.
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends installing the Robot Base Kit or Robot Complete Kit with
the Get Started Guide for Robots.
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=31
5. Run an automated yml file that opens a GStreamer* sample application inside the EI for AMR Docker*
container.
cp test.mp4 ${CONTAINER_BASE_PATH}/01_docker_sdk_env/docker_compose/05_tutorials/test.mp4
And update line 27 to:
Troubleshooting
If running the yml file gets stuck at downloading:
gedit 01_docker_sdk_env/docker_compose/05_tutorials/gstreamer_video.tutorial.yml
# In the same way open any other yml you want to test behind a proxy.
Add the following lines after echo "*** Run gst-launch with video sample from the Docker container ***",
replacing http://<http_proxy>:port with your actual environment http_proxy.
export http_proxy="http://<http_proxy>:port"
export https_proxy="http://<https_proxy>:port"
Check your system date and time:
date
If the date is incorrect, contact your local support team for help setting the correct date and time.
For general robot issues, go to: Troubleshooting for Robot Tutorials.
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
3. If the image is not installed, Intel recommends installing the Robot Base Kit or Robot Complete Kit with
the Get Started Guide for Robots.
4. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
5. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
sudo chmod a+rw /dev/video*
6. Get the stream from the webcam using GStreamer*:
Troubleshooting
If the following error is encountered:
Run a GStreamer* video pipeline using the Intel® RealSense™ plugin in a Docker* container in order to use
an Intel® RealSense™ video camera as the video source.
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
3. If the image is not installed, Intel recommends installing the Robot Base Kit or Robot Complete Kit with
the Get Started Guide for Robots.
4. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
5. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=45
sudo chmod a+rw /dev/video*
6. Get the stream from the Intel® RealSense™ camera using gstreamer:
Troubleshooting
• In some cases, the stream may not appear due to permission issues on the host. You may see this error
message:
If the problem persists, you can try any or all of the following:
• Verify that $DISPLAY has the correct value.
• Perform an Intel® RealSense™ hardware reset:
Point Cloud Library (PCL) Optimized for the Intel® oneAPI Base Toolkit
Collateral and description:
• “Getting Started” on the Point Cloud Library site: a high-level overview of PCL
vim oneapi_octree_search.cpp
3. Place the following inside the file:
#include <iostream>
#include <fstream>
#include <numeric>
#include <pcl/oneapi/octree/octree.hpp>
#include <pcl/oneapi/containers/device_array.h>
#include <pcl/point_cloud.h>
queries.resize(query_size);
radiuses.resize(query_size);
for (i = 0; i < query_size; ++i)
{
queries[i].x = ((float)rand())/(float)RAND_MAX * cube_size;
queries[i].y = ((float)rand())/(float)RAND_MAX * cube_size;
queries[i].z = ((float)rand())/(float)RAND_MAX * cube_size;
radiuses[i] = ((float)rand())/(float)RAND_MAX * max_radius;
};
indices.resize(query_size / 2);
for(i = 0; i < query_size / 2; ++i)
{
indices[i] = i * 2;
}
//oneAPI build
pcl::oneapi::Octree octree_device;
octree_device.setCloud(cloud_device);
octree_device.build();
//oneAPI octree radius search with shared radius
octree_device.radiusSearch(queries_device, shared_radius, max_answers, result_device1);
//oneAPI octree radius search with shared radius using indices to specify
//the queries.
pcl::oneapi::Octree::Indices cloud_indices;
cloud_indices.upload(indices);
octree_device.radiusSearch(queries_device, cloud_indices, shared_radius, max_answers,
result_device3);
//Download results
std::vector<int> sizes1;
std::vector<int> sizes2;
std::vector<int> sizes3;
result_device1.sizes.download(sizes1);
result_device2.sizes.download(sizes2);
result_device3.sizes.download(sizes3);
int query_idx = 2;
std::cout << "Neighbors within shared radius search at ("
<< queries[query_idx].x << " "
<< queries[query_idx].y << " "
<< queries[query_idx].z << ") with radius=" << shared_radius << std::endl;
for (i = 0; i < sizes1[query_idx]; ++i)
{
std::cout << " " << points[downloaded_buffer1[max_answers * query_idx + i]].x
<< " " << points[downloaded_buffer1[max_answers * query_idx + i]].y
<< " " << points[downloaded_buffer1[max_answers * query_idx + i]].z
<< " (distance: " << dist(points[downloaded_buffer1[max_answers * query_idx +
i]], queries[query_idx]) << ")" << std::endl;
}
include_directories(${PCL_INCLUDE_DIRS} ${PCL-ONEAPI_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS} ${PCL-ONEAPI_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS} ${PCL-ONEAPI_DEFINITIONS})
add_executable (${target} oneapi_octree_search.cpp)
target_link_libraries (${target} sycl pcl_oneapi_containers pcl_oneapi_octree pcl_octree)
6. Source the Intel® oneAPI Base Toolkit environment:
export PATH=/home/eiforamr/workspace/lib/pcl/share/pcl-1.12:/home/eiforamr/workspace/lib/pcl/share/pcl-oneapi-1.12:$PATH
source /opt/intel/oneapi/setvars.sh
7. Build the code:
cd /home/eiforamr/workspace/octree/
mkdir build && cd build
cmake ../
make -j
8. Run the binary:
./oneapi_octree_search
Expected results example:
Neighbors within shared radius search at (671.675 733.78 466.178) with radius=34.1333:
  660.296 725.957 439.677 (distance: 29.8829)
  665.768 721.884 442.919 (distance: 26.7846)
  683.988 714.608 445.164 (distance: 30.9962)
  677.927 725.08 446.531 (distance: 22.3788)
  695.066 723.509 445.762 (distance: 32.7028)
Neighbors within individual radius search at (671.675 733.78 466.178) with radius=19.3623:
  672.71 736.679 447.835 (distance: 18.6)
  664.46 731.504 452.074 (distance: 16.0048)
  671.238 725.881 461.408 (distance: 9.23819)
  667.707 718.527 466.622 (distance: 15.7669)
  654.552 733.636 467.795 (distance: 17.1993)
Neighbors within indices radius search at (671.675 733.78 466.178) with radius=34.1333:
  660.296 725.957 439.677 (distance: 29.8829)
  665.768 721.884 442.919 (distance: 26.7846)
  683.988 714.608 445.164 (distance: 30.9962)
  677.927 725.08 446.531 (distance: 22.3788)
  695.066 723.509 445.762 (distance: 32.7028)
The search only finds the first five neighbors (as specified by max_answers), so a different radius finds
different points.
Code Explanation
Generate the point cloud data, queries, radii, and indices with random numbers.
queries.resize(query_size);
radiuses.resize(query_size);
for (i = 0; i < query_size; ++i)
{
queries[i].x = ((float)rand())/(float)RAND_MAX * cube_size;
queries[i].y = ((float)rand())/(float)RAND_MAX * cube_size;
queries[i].z = ((float)rand())/(float)RAND_MAX * cube_size;
radiuses[i] = ((float)rand())/(float)RAND_MAX * max_radius;
};
indices.resize(query_size / 2);
for(i = 0; i < query_size / 2; ++i)
{
indices[i] = i * 2;
}
Create and build the Intel® oneAPI Base Toolkit point cloud; then upload the queries and radii to an Intel® oneAPI Base Toolkit device.
//oneAPI build
pcl::oneapi::Octree octree_device;
octree_device.setCloud(cloud_device);
octree_device.build();
//oneAPI octree radius search with shared radius using indices to specify
//the queries.
pcl::oneapi::Octree::Indices cloud_indices;
cloud_indices.upload(indices);
octree_device.radiusSearch(queries_device, cloud_indices, shared_radius, max_answers,
result_device3);
Download the search results from the Intel® oneAPI Base Toolkit device. The size vector contains the size of
found neighbors for each query. The downloaded_buffer vector contains the index of all found neighbors for
each query.
//Download results
std::vector<int> sizes1;
std::vector<int> sizes2;
std::vector<int> sizes3;
result_device1.sizes.download(sizes1);
result_device2.sizes.download(sizes2);
result_device3.sizes.download(sizes3);
int query_idx = 2;
std::cout << "Neighbors within shared radius search at ("
<< queries[query_idx].x << " "
<< queries[query_idx].y << " "
<< queries[query_idx].z << ") with radius=" << shared_radius << std::endl;
for (i = 0; i < sizes1[query_idx]; ++i)
{
std::cout << " " << points[downloaded_buffer1[max_answers * query_idx + i]].x
<< " " << points[downloaded_buffer1[max_answers * query_idx + i]].y
<< " " << points[downloaded_buffer1[max_answers * query_idx + i]].z
<< " (distance: " << dist(points[downloaded_buffer1[max_answers * query_idx +
i]], queries[query_idx]) << ")" << std::endl;
}
The Intel® oneAPI Base Toolkit optimization for convex hull is ported from CUDA optimization code, and the
result of the Intel® oneAPI Base Toolkit optimization matches the result of the CUDA implementation.
However, it does not match the result of the PCL CPU implementation.
This tutorial shows how to use these optimizations inside a Docker* image. For the same functionality
outside of Docker* images, see PCL Optimizations Outside of Docker* Images.
1. Prepare the environment:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
./run_interactive_docker.sh eiforamr-full-flavour-sdk:2022.3 root -c full_flavor
mkdir convex_hull && cd convex_hull
2. Create the file oneapi_convex_hull.cpp:
vim oneapi_convex_hull.cpp
3. Place the following inside the file:
#include <pcl/oneapi/surface/convex_hull.h>
#include <pcl/io/pcd_io.h>
#include <pcl/PolygonMesh.h>
#include <pcl/surface/convex_hull.h>
#include <pcl/visualization/pcl_visualizer.h>
PseudoConvexHull3D pch(1e5);
PseudoConvexHull3D::Cloud cloud_device;
PseudoConvexHull3D::Cloud convex_device;
cloud_device.upload(cloud_ptr->points);
pch.reconstruct(cloud_device, convex_device);
pcl::PolygonMesh mesh;
pcl::ConvexHull<pcl::PointXYZ> ch;
ch.setInputCloud(convex_ptr);
ch.reconstruct(mesh);
vis.resetCamera ();
vis.spin ();
}
4. Create a CMakeLists.txt file:
vim CMakeLists.txt
5. Place the following inside the file:
cmake_minimum_required(VERSION 3.5 FATAL_ERROR)
set(target oneapi_convex_hull)
set(CMAKE_CXX_COMPILER dpcpp)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_FLAGS "-Wall -Wpedantic -Wno-unknown-pragmas -Wno-pass-failed -Wno-unneeded-internal-declaration -Wno-unused-function -Wno-gnu-anonymous-struct -Wno-nested-anon-types -Wno-extra-semi -Wno-unused-local-typedef -fsycl -fsycl-unnamed-lambda -ferror-limit=1")
project(${target})
include_directories(${PCL_INCLUDE_DIRS} ${PCL-ONEAPI_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS} ${PCL-ONEAPI_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS} ${PCL-ONEAPI_DEFINITIONS})
6. Source the Intel® oneAPI Base Toolkit environment:
export PATH=/home/eiforamr/workspace/lib/pcl/share/pcl-1.12:/home/eiforamr/workspace/lib/pcl/share/pcl-oneapi-1.12:$PATH
source /opt/intel/oneapi/setvars.sh
7. Build the code:
cd /home/eiforamr/workspace/convex_hull/
mkdir build && cd build
cmake ../
make -j
8. Download the test data from GitHub*:
wget https://raw.githubusercontent.com/PointCloudLibrary/pcl/master/test/bun0.pcd
# if the file is not downloaded, set the proxies first and try again:
export http_proxy="http://<http_proxy>:port"
export https_proxy="http://<https_proxy>:port"
9. Run the binary:
./oneapi_convex_hull ./bun0.pcd
Expected results: a convex hull triangle mesh
Code Explanation
Load the test data from GitHub* into a PointCloud<PointXYZ>.
Create GPU input and output device arrays, and load point cloud data into the input device array.
PseudoConvexHull3D::Cloud cloud_device;
PseudoConvexHull3D::Cloud convex_device;
cloud_device.upload(cloud_ptr->points);
GPU: Perform reconstruction, and generate convex hull vertices.
pch.reconstruct(cloud_device, convex_device);
Download the convex hull vertices from the GPU to the CPU.
convex_device.download(convex_ptr->points);
CPU: Perform reconstruction to generate the convex hull mesh.
pcl::PolygonMesh mesh;
pcl::ConvexHull<pcl::PointXYZ> ch;
ch.setInputCloud(convex_ptr);
ch.reconstruct(mesh);
Visualize the convex hull mesh results.
This tutorial shows how to use these optimizations inside a Docker* image. For the same functionality
outside of Docker* images, see PCL Optimizations Outside of Docker* Images.
1. Prepare the environment:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
./run_interactive_docker.sh eiforamr-full-flavour-sdk:2022.3 root -c full_flavor
mkdir sample_consensus && cd sample_consensus
2. Create the file oneapi_sample_consensus.cpp:
vim oneapi_sample_consensus.cpp
3. Place the following inside the file:
/*
* Software License Agreement (BSD License)
*
* Point Cloud Library (PCL) - www.pointclouds.org
* Copyright (c) 2010-2012, Willow Garage, Inc.
* Copyright (c) 2014-, Open Perception, Inc.
*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
*
* * Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* * Redistributions in binary form must reproduce the above
* copyright notice, this list of conditions and the following
* disclaimer in the documentation and/or other materials provided
* with the distribution.
* * Neither the name of the copyright holder(s) nor the names of its
* contributors may be used to endorse or promote products derived
* from this software without specific prior written permission.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
* LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
* COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
* BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN
* ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
* POSSIBILITY OF SUCH DAMAGE.
*
*/
#include <pcl/oneapi/sample_consensus/sac_model_plane.h>
#include <pcl/oneapi/sample_consensus/ransac.h>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
// Algorithm tests
typename pcl::oneapi::SampleConsensusModelPlane::Ptr sac_model (new
pcl::oneapi::SampleConsensusModelPlane (cloud_device));
pcl::oneapi::RandomSampleConsensus sac (sac_model);
sac.setMaxIterations (10000);
sac.setDistanceThreshold (0.03);
result = sac.computeModel ();
// Best model
pcl::oneapi::SampleConsensusModelPlane::Indices sample;
sac.getModel (sample);
// Coefficient
pcl::oneapi::SampleConsensusModelPlane::Coefficients coeffs;
sac.getModelCoefficients (coeffs);
// Inliers
pcl::Indices pcl_inliers;
int inliers_size = sac.getInliersSize ();
pcl_inliers.resize(inliers_size);
// Refined coefficient
pcl::oneapi::SampleConsensusModelPlane::Coefficients coeff_refined;
sac_model->optimizeModelCoefficients (*cloud_ptr, pcl_inliers, coeffs, coeff_refined);
// print log
std::cout << "input cloud size: " << cloud_ptr->points.size() << std::endl;
std::cout << "inliers size : " << inliers_size << std::endl;
std::cout << " plane model coefficient: " << coeffs[0] << ", " << coeffs[1] << ", " <<
coeffs[2] << ", " << coeffs[3] << std::endl;
std::cout << " Optimized coefficient : " << coeff_refined[0] << ", " << coeff_refined[1] <<
", " << coeff_refined[2] << ", " << coeff_refined[3] << std::endl;
}
4. Create a CMakeLists.txt file:
vim CMakeLists.txt
5. Place the following inside the file:
cmake_minimum_required(VERSION 3.5 FATAL_ERROR)
set(target oneapi_sample_consensus)
set(CMAKE_CXX_COMPILER dpcpp)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_FLAGS "-Wall -Wpedantic -Wno-unknown-pragmas -Wno-pass-failed -Wno-unneeded-internal-declaration -Wno-unused-function -Wno-gnu-anonymous-struct -Wno-nested-anon-types -Wno-extra-semi -Wno-unused-local-typedef -fsycl -fsycl-unnamed-lambda -ferror-limit=1")
project(${target})
include_directories(${PCL_INCLUDE_DIRS} ${PCL-ONEAPI_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS} ${PCL-ONEAPI_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS} ${PCL-ONEAPI_DEFINITIONS})
6. Source the Intel® oneAPI Base Toolkit environment:
export PATH=/home/eiforamr/workspace/lib/pcl/share/pcl-1.12:/home/eiforamr/workspace/lib/pcl/share/pcl-oneapi-1.12:$PATH
source /opt/intel/oneapi/setvars.sh
7. Build the code:
cd /home/eiforamr/workspace/sample_consensus/
mkdir build && cd build
cmake ../
make -j
8. Download the test data from GitHub*:
wget https://raw.githubusercontent.com/PointCloudLibrary/data/5c26bdd0591ba150b91858b5c9fe5e91cb39ae86/segmentation/mOSD/test/test59.pcd
# if the file is not downloaded, set the proxies first and try again:
export http_proxy="http://<http_proxy>:port"
export https_proxy="http://<https_proxy>:port"
9. Run the binary:
./oneapi_sample_consensus ./test59.pcd
Expected results example:
input cloud size: 307200
inliers size : 77316
plane model coefficient: -0.0789502, -0.816661, -0.571692, 0.546386
Optimized coefficient : -0.0722213, -0.818286, -0.570256, 0.547587
Code Explanation
Load the test data from GitHub* into a PointCloud<PointXYZ>.
pcl::oneapi::SampleConsensusModelPlane::Indices sample;
sac.getModel (sample);
Result (coefficient model):
pcl::oneapi::SampleConsensusModelPlane::Coefficients coeffs;
sac.getModelCoefficients (coeffs);
Result (inliers model):
pcl::Indices pcl_inliers;
int inliers_size = sac.getInliersSize ();
pcl_inliers.resize(inliers_size);
pcl::oneapi::SampleConsensusModelPlane::Coefficients coeff_refined;
sac_model->optimizeModelCoefficients (*cloud_ptr, pcl_inliers, coeffs, coeff_refined);
Result (output log):
std::cout << "input cloud size: " << cloud_ptr->points.size() << std::endl;
std::cout << "inliers size : " << inliers_size << std::endl;
std::cout << " plane model coefficient: " << coeffs[0] << ", " << coeffs[1] << ", " <<
coeffs[2] << ", " << coeffs[3] << std::endl;
std::cout << " Optimized coefficient : " << coeff_refined[0] << ", " << coeff_refined[1] <<
", " << coeff_refined[2] << ", " << coeff_refined[3] << std::endl;
This tutorial shows how to use these optimizations inside a Docker* image. For the same functionality
outside of Docker* images, see PCL Optimizations Outside of Docker* Images.
1. Prepare the environment:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
./run_interactive_docker.sh eiforamr-full-flavour-sdk:2022.3 root -c full_flavor
mkdir one_api_segmentation && cd one_api_segmentation
2. Create the file oneapi_segmentation.cpp:
vim oneapi_segmentation.cpp
3. Place the following inside the file:
#include <pcl/oneapi/segmentation/segmentation.h>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/pcl_config.h>
//Optional
seg.setOptimizeCoefficients(true);
//Set algorithm method and model type
seg.setMethodType(pcl::oneapi::SAC_RANSAC);
seg.setModelType (pcl::oneapi::SACMODEL_PLANE);
std::cout << "input cloud size : " << seg.getCloudSize() << std::endl;
std::cout << "inliers size : " << seg.getInliersSize() << std::endl;
std::cout << "model coefficients : " << coeffs[0] << ", " << coeffs[1] << ", " << coeffs[2]
<< ", " << coeffs[3] << std::endl;
return 0;
}
4. Create a CMakeLists.txt file:
vim CMakeLists.txt
5. Place the following inside the file:
cmake_minimum_required(VERSION 3.5 FATAL_ERROR)
set(target oneapi_segmentation)
set(CMAKE_CXX_COMPILER dpcpp)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_FLAGS "-Wall -Wpedantic -Wno-unknown-pragmas -Wno-pass-failed -Wno-unneeded-internal-declaration -Wno-unused-function -Wno-gnu-anonymous-struct -Wno-nested-anon-types -Wno-extra-semi -Wno-unused-local-typedef -fsycl -fsycl-unnamed-lambda -ferror-limit=1")
project(${target})
include_directories(${PCL_INCLUDE_DIRS} ${PCL-ONEAPI_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS} ${PCL-ONEAPI_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS} ${PCL-ONEAPI_DEFINITIONS})
6. Source the Intel® oneAPI Base Toolkit environment:
export PATH=/home/eiforamr/workspace/lib/pcl/share/pcl-1.12:/home/eiforamr/workspace/lib/pcl/share/pcl-oneapi-1.12:$PATH
source /opt/intel/oneapi/setvars.sh
7. Build the code:
cd /home/eiforamr/workspace/one_api_segmentation/
mkdir build && cd build
cmake ../
make -j
Code Explanation
Load the test data from GitHub* into a PointCloud<PointXYZ>.
pcl::oneapi::SACSegmentation seg;
Configure the oneapi_segmentation class.
seg.setInputCloud(cloud_input);
seg.setProbability(0.99);
seg.setMaxIterations(50);
seg.setDistanceThreshold(0.01);
//Optional
seg.setOptimizeCoefficients(true);
//Set algorithm method and model type
seg.setMethodType(pcl::oneapi::SAC_RANSAC);
seg.setModelType (pcl::oneapi::SACMODEL_PLANE);
Set to true if a coefficient refinement is required.
seg.setOptimizeCoefficients(true);
Set the algorithm method and model type.
seg.setMethodType(pcl::oneapi::SAC_RANSAC);
seg.setModelType (pcl::oneapi::SACMODEL_PLANE);
Declare output parameters for getting inliers and model coefficients.
seg.segment(*inliers, coeffs);
Result (output log):
std::cout << "input cloud size : " << seg.getCloudSize() << std::endl;
std::cout << "inliers size : " << seg.getInliersSize() << std::endl;
std::cout << "model coefficients : " << coeffs[0] << ", " << coeffs[1] << ", " << coeffs[2]
<< ", " << coeffs[3] << std::endl;
Surface Reconstruction with Intel® oneAPI Base Toolkit's Moving Least Squares (MLS)
MLS creates a 3D surface from a point cloud through either down-sampling or up-sampling. Intel® oneAPI
Base Toolkit's MLS is based on the original MLS API. Differences between the two:
• Intel® oneAPI Base Toolkit's MLS calculates with 32-bit float instead of 64-bit double.
• The surface in Intel® oneAPI Base Toolkit's MLS is constructed as a set of indices grouped into multiple blocks.
This consumes more system memory than the original version. Control the block size with
setSearchBlockSize.
• Intel® oneAPI Base Toolkit's MLS improves the performance of all up-sampling methods.
• The Intel® oneAPI Base Toolkit namespace must be appended to the original MovingLeastSquares class.
See resampling.rst for details.
This tutorial shows how to use these optimizations inside a Docker* image. For the same functionality
outside of Docker* images, see PCL Optimizations Outside of Docker* Images.
1. Prepare the environment:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
./run_interactive_docker.sh eiforamr-full-flavour-sdk:2022.3 root -c full_flavor
mkdir mls && cd mls
2. Create the file oneapi_mls.cpp:
vim oneapi_mls.cpp
3. Place the following inside the file:
#include <pcl/oneapi/surface/mls.h>
#include <pcl/oneapi/search/kdtree.h>
#include <pcl/point_types.h>
#include <pcl/io/pcd_io.h>
// Output has the PointNormal type in order to store the normals calculated by MLS
pcl::PointCloud<pcl::PointNormal> mls_points;
// Init object (second point type is for the normals, even if unused)
pcl::oneapi::MovingLeastSquares<pcl::PointXYZ, pcl::PointNormal> mls;
mls.setComputeNormals (true);
// Set parameters
mls.setInputCloud (cloud_ptr);
mls.setPolynomialOrder (2);
mls.setSearchMethod (tree);
mls.setSearchRadius (0.03);
// Reconstruct
mls.process (mls_points);
// Save output
pcl::io::savePCDFile ("bun0-mls.pcd", mls_points);
}
6. Source the Intel® oneAPI Base Toolkit environment:
export PATH=/home/eiforamr/workspace/lib/pcl/share/pcl-1.12:/home/eiforamr/workspace/lib/pcl/share/pcl-oneapi-1.12:$PATH
source /opt/intel/oneapi/setvars.sh
7. Build the code:
cd /home/eiforamr/workspace/mls/
mkdir build && cd build
cmake ../
make -j
8. Download the test data from GitHub*:
wget https://raw.githubusercontent.com/PointCloudLibrary/pcl/master/test/bun0.pcd
# if the file is not downloaded, set the proxies first and try again:
export http_proxy="http://<http_proxy>:port"
export https_proxy="http://<https_proxy>:port"
9. Run the binary:
./oneapi_mls ./bun0.pcd
To see the smoothed cloud (zoom out to see the reconstructed shape):
/home/eiforamr/workspace/lib/pcl/bin/pcl_viewer bun0-mls.pcd
Code Explanation
Intel® oneAPI Base Toolkit's MLS requires this header.
#include <pcl/point_types.h>
#include <pcl/io/pcd_io.h>
Load the test data from GitHub* into a PointCloud<PointXYZ> (these fields are mandatory; other fields are
allowed and preserved).
mls.setComputeNormals (true);
Append the Intel® oneAPI Base Toolkit namespace to the original MovingLeastSquares class. The first
template type is for the input and output cloud. Only the XYZ dimensions of the input are smoothed in the
output.
mls.setPolynomialOrder (2);
If the normal and original dimensions need to be in the same cloud, the fields have to be concatenated.
// Save output
pcl::io::savePCDFile ("bun0-mls.pcd", mls_points);
ICP (Iterative Closest Point) is an algorithm used to minimize the difference between two point clouds. The standard ICP (not the joint or generalized variants) has been optimized using the Intel® oneAPI Base Toolkit.
See registration_api for details.
This tutorial shows how to use these optimizations inside a Docker* image. For the same functionality
outside of Docker* images, see PCL Optimizations Outside of Docker* Images.
1. Prepare the environment:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
./run_interactive_docker.sh eiforamr-full-flavour-sdk:2022.3 root -c full_flavor
mkdir one_api_registration && cd one_api_registration
2. Create the file oneapi_icp_example.cpp:
vim oneapi_icp_example.cpp
3. Place the following inside the file:
#include <pcl/oneapi/registration/icp.h>
#include <pcl/console/parse.h>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/point_representation.h>
#include <pcl/io/pcd_io.h>
/* ---[ */
int
main (int argc, char** argv)
{
// Parse the command line arguments for .pcd files
std::vector<int> p_file_indices;
p_file_indices = parse_file_extension_argument (argc, argv, ".pcd");
if (p_file_indices.size () != 2)
{
print_error ("Need one input source PCD file and one input target PCD file to continue.\n");
print_error ("Example: %s source.pcd target.pcd\n", argv[0]);
return (-1);
}
PointCloud<PointXYZ> output;
// Compute the best transformation
pcl::oneapi::IterativeClosestPoint<PointXYZ, PointXYZ> reg;
reg.setMaximumIterations(20);
reg.setTransformationEpsilon(1e-12);
reg.setMaxCorrespondenceDistance(2);
reg.setInputSource(src);
reg.setInputTarget(tgt);
// Register
reg.align(output); // point cloud output of the alignment, i.e., the source cloud after the transformation is applied
find_package(PCL 1.12 REQUIRED)
find_package(PCL-ONEAPI 1.12 REQUIRED)
include_directories(${PCL_INCLUDE_DIRS} ${PCL-ONEAPI_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS} ${PCL-ONEAPI_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS} ${PCL-ONEAPI_DEFINITIONS})
6. Source the Intel® oneAPI Base Toolkit environment:
export PATH=/home/eiforamr/workspace/lib/pcl/share/pcl-1.12:/home/eiforamr/workspace/lib/pcl/share/pcl-oneapi-1.12:$PATH
source /opt/intel/oneapi/setvars.sh
7. Build the code:
cd /home/eiforamr/workspace/one_api_registration/
mkdir build && cd build
cmake ../
make -j
8. Download the test data from GitHub*:
wget https://raw.githubusercontent.com/NVIDIA-AI-IOT/cuPCL/main/cuOctree/test_P.pcd
wget https://raw.githubusercontent.com/NVIDIA-AI-IOT/cuPCL/main/cuOctree/test_Q.pcd
# if the files are not downloaded, set the proxies first and try again:
export http_proxy="http://<http_proxy>:port"
export https_proxy="http://<https_proxy>:port"
9. Run the binary:
./oneapi_icp_example test_P.pcd test_Q.pcd
Expected results example:
Transform Matrix:
0.998899 0.0107221 0.0457259 0.0790768
-0.00950837 0.999602 -0.0266773 0.0252976
-0.0459936 0.026213 0.998599 0.0677631
0 0 0 1
Code Explanation
Define two input point Clouds (src, tgt), declare the output point cloud, and load the test data from GitHub*.
PointCloud<PointXYZ> output;
Declare the Intel® oneAPI Base Toolkit's ICP, and set the input configuration parameters.
reg.setInputSource(src);
reg.setInputTarget(tgt);
// Register
reg.align(output); // point cloud output of the alignment, i.e., the source cloud after the transformation is applied
Get the computed matrix transformation, print it, and save the transformed point cloud.
Intel® oneAPI Base Toolkit's KdTree is similar to pcl::KdTreeFLANN except that Intel® oneAPI Base Toolkit's
KdTree is able to search the entire point cloud in a single call. Intel® oneAPI Base Toolkit's KdTree returns a
two-dimensional vector of indices and distances for the entire point cloud search. Both nearestKSearch and
radiusSearch are supported.
See kdtree_search for details.
This tutorial shows how to use these optimizations inside a Docker* image. For the same functionality
outside of Docker* images, see PCL Optimizations Outside of Docker* Images.
1. Prepare the environment:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
./run_interactive_docker.sh eiforamr-full-flavour-sdk:2022.3 root -c full_flavor
mkdir kdtree && cd kdtree
2. Create the file oneapi_kdtree.cpp:
vim oneapi_kdtree.cpp
3. Place the following inside the file:
#include <pcl/oneapi/search/kdtree.h> // for KdTree
#include <pcl/point_cloud.h>
#include <vector>
#include <iostream>
int
main (int argc, char** argv)
{
srand (time (NULL));
// Generate pointcloud data
cloud->width = 1000;
cloud->height = 1;
cloud->points.resize (cloud->width * cloud->height);
pcl::oneapi::search::KdTree<pcl::PointXYZ> kdtree;
kdtree.setInputCloud (cloud);
pointsRadiusSquaredDistance, 10);
return 0;
}
4. Create a CMakeLists.txt file:
vim CMakeLists.txt
5. Place the following inside the file:
cmake_minimum_required(VERSION 3.5 FATAL_ERROR)
set(target oneapi_kdtree)
set(CMAKE_CXX_COMPILER dpcpp)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_FLAGS "-Wall -Wpedantic -Wno-unknown-pragmas -Wno-pass-failed -Wno-unneeded-internal-declaration -Wno-unused-function -Wno-gnu-anonymous-struct -Wno-nested-anon-types -Wno-extra-semi -Wno-unused-local-typedef -fsycl -fsycl-unnamed-lambda -ferror-limit=1")
project(${target})
include_directories(${PCL_INCLUDE_DIRS} ${PCL-ONEAPI_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS} ${PCL-ONEAPI_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS} ${PCL-ONEAPI_DEFINITIONS})
6. Source the Intel® oneAPI Base Toolkit environment:
export PATH=/home/eiforamr/workspace/lib/pcl/share/pcl-1.12:/home/eiforamr/workspace/lib/pcl/share/pcl-oneapi-1.12:$PATH
source /opt/intel/oneapi/setvars.sh
7. Build the code:
cd /home/eiforamr/workspace/kdtree/
mkdir build && cd build
cmake ../
make -j
8. Run the binary:
./oneapi_kdtree
Expected results example:
K=5 neighbors from (868.536,165.24,588.71):
  895.824 185.838 555.685 (squared distance: 2259.57)
  877.741 116.544 656.683 (squared distance: 7076.32)
  906.9 102.777 515.255 (squared distance: 10769.1)
  817.828 258.588 546.829 (squared distance: 13039.3)
  767.465 164.017 644.785 (squared distance: 13361.4)
K=5 neighbors from (169.766,772.676,607.395):
  219.717 774.625 586.534 (squared distance: 2934.17)
  137.948 729.391 630.512 (squared distance: 3420.39)
  211.662 726.615 591.354 (squared distance: 4134.23)
  241.534 720.475 579.8 (squared distance: 8637.12)
  236.382 811.854 548.706 (squared distance: 9417.13)
K=5 neighbors from (974.478,854.754,392.108):
  1001.23 881.896 424.253 (squared distance: 2485.66)
  1002.05 882.791 460.627 (squared distance: 6241.17)
  980.62 864.419 471.809 (squared distance: 6483.47)
  891.607 840.559 334.935 (squared distance: 10337.9)
  875.04 824.699 399.918 (squared distance: 10852.3)
Radius=100 neighbors from (868.536,165.24,588.71):
  895.824 185.838 555.685 (squared distance: 2259.57)
  877.741 116.544 656.683 (squared distance: 7076.32)
Radius=100 neighbors from (169.766,772.676,607.395):
  219.717 774.625 586.534 (squared distance: 2934.17)
  137.948 729.391 630.512 (squared distance: 3420.39)
  211.662 726.615 591.354 (squared distance: 4134.23)
  241.534 720.475 579.8 (squared distance: 8637.12)
  236.382 811.854 548.706 (squared distance: 9417.13)
Radius=100 neighbors from (974.478,854.754,392.108):
  1001.23 881.896 424.253 (squared distance: 2485.66)
  1002.05 882.791 460.627 (squared distance: 6241.17)
  980.62 864.419 471.809 (squared distance: 6483.47)
Code Explanation
Intel® oneAPI Base Toolkit's KdTree requires this header.
#include <pcl/point_cloud.h>
Seed rand() with the system time, create and fill a point cloud with random data (cloud), create and fill
another point cloud with random coordinates (searchPoints), and search three coordinates using a single
call.
pcl::oneapi::search::KdTree<pcl::PointXYZ> kdtree;
kdtree.setInputCloud (cloud);
Create an integer K set to five, and create two two-dimensional vectors to store the K nearest neighbors from the search.
for (std::size_t i = 0; i < pointsIdxRadiusSearch.at(j).size(); ++i)
{
std::cout << " " << (*cloud)[ pointsIdxRadiusSearch.at(j)[i] ].x
<< " " << (*cloud)[ pointsIdxRadiusSearch.at(j)[i] ].y
<< " " << (*cloud)[ pointsIdxRadiusSearch.at(j)[i] ].z
<< " (squared distance: " << pointsRadiusSquaredDistance.at(j)[i] << ")" <<
std::endl;
}
}
This tutorial shows how to use these optimizations inside a Docker* image. For the same functionality
outside of Docker* images, see PCL Optimizations Outside of Docker* Images.
1. Prepare the environment:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
./run_interactive_docker.sh eiforamr-full-flavour-sdk:2022.3 root -c full_flavor
mkdir voxel_grid && cd voxel_grid
2. Create the file oneapi_voxel_grid.cpp:
vim oneapi_voxel_grid.cpp
3. Place the following inside the file:
#include <pcl/oneapi/filters/voxel_grid.h>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
// GPU calculate
pcl::oneapi::VoxelGrid vg_oneapi;
vg_oneapi.setInputCloud(cloud_device);
float leafsize= 0.005f;
// print log
std::cout << "[oneapi voxel grid] PointCloud before filtering: " << cloud_device.size() <<
std::endl;
std::cout << "[oneapi voxel grid] PointCloud after filtering: " << cloud_device_o.size() <<
std::endl;
}
4. Create a CMakeLists.txt file:
vim CMakeLists.txt
5. Place the following inside the file:
cmake_minimum_required(VERSION 3.5 FATAL_ERROR)
set(target oneapi_voxel_grid)
set(CMAKE_CXX_COMPILER dpcpp)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_FLAGS "-Wall -Wpedantic -Wno-unknown-pragmas -Wno-pass-failed -Wno-unneeded-internal-declaration -Wno-unused-function -Wno-gnu-anonymous-struct -Wno-nested-anon-types -Wno-extra-semi -Wno-unused-local-typedef -fsycl -fsycl-unnamed-lambda -ferror-limit=1")
project(${target})
include_directories(${PCL_INCLUDE_DIRS} ${PCL-ONEAPI_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS} ${PCL-ONEAPI_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS} ${PCL-ONEAPI_DEFINITIONS})
6. Source the Intel® oneAPI Base Toolkit environment:
export PATH=/home/eiforamr/workspace/lib/pcl/share/pcl-1.12:/home/eiforamr/workspace/lib/pcl/share/pcl-oneapi-1.12:$PATH
source /opt/intel/oneapi/setvars.sh
7. Build the code:
cd /home/eiforamr/workspace/voxel_grid/
mkdir build && cd build
cmake ../
make -j
8. Download the test data from GitHub*:
wget https://raw.githubusercontent.com/PointCloudLibrary/data/5c26bdd0591ba150b91858b5c9fe5e91cb39ae86/tutorials/table_scene_lms400.pcd
# if the file is not downloaded, set the proxies first and try again:
export http_proxy="http://<http_proxy>:port"
export https_proxy="http://<https_proxy>:port"
9. Run the binary:
./oneapi_voxel_grid
Expected results example:
[oneapi voxel grid] PointCloud before filtering: 460400
[oneapi voxel grid] PointCloud after filtering: 41049
Code Explanation
Load the test data from GitHub* into a PointCloud<PointXYZ>.
// GPU calculate
pcl::oneapi::VoxelGrid vg_oneapi;
vg_oneapi.setInputCloud(cloud_device);
float leafsize= 0.005f;
vg_oneapi.setLeafSize (leafsize, leafsize, leafsize);
vg_oneapi.filter(cloud_device_o);
Result (output log):
// print log
std::cout << "[oneapi voxel grid] PointCloud before filtering: " << cloud_device.size() <<
std::endl;
std::cout << "[oneapi voxel grid] PointCloud after filtering: " << cloud_device_o.size() <<
std::endl;
This tutorial shows how to use these optimizations inside a Docker* image. For the same functionality
outside of Docker* images, see PCL Optimizations Outside of Docker* Images.
1. Prepare the environment:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
./run_interactive_docker.sh eiforamr-full-flavour-sdk:2022.3 root -c full_flavor
mkdir passthrough && cd passthrough
2. Create the file oneapi_passthrough.cpp:
vim oneapi_passthrough.cpp
3. Place the following inside the file:
#include <pcl/oneapi/filters/passthrough.h>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
// GPU calculate
pcl::oneapi::PassThrough ps;
ps.setInputCloud(cloud_device);
ps.setFilterFieldName ("z");
ps.setFilterLimits (0.0, 1.0);
ps.filter(cloud_device_o);
// print log
std::cout << "[oneapi passthrough] PointCloud before filtering: " << cloud_device.size() << std::endl;
std::cout << "[oneapi passthrough] PointCloud after filtering: " << cloud_device_o.size() << std::endl;
}
4. Create a CMakeLists.txt file:
vim CMakeLists.txt
5. Place the following inside the file:
cmake_minimum_required(VERSION 3.5 FATAL_ERROR)
set(target oneapi_passthrough)
set(CMAKE_CXX_COMPILER dpcpp)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_FLAGS "-Wall -Wpedantic -Wno-unknown-pragmas -Wno-pass-failed -Wno-unneeded-internal-declaration -Wno-unused-function -Wno-gnu-anonymous-struct -Wno-nested-anon-types -Wno-extra-semi -Wno-unused-local-typedef -fsycl -fsycl-unnamed-lambda -ferror-limit=1")
project(${target})
include_directories(${PCL_INCLUDE_DIRS} ${PCL-ONEAPI_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS} ${PCL-ONEAPI_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS} ${PCL-ONEAPI_DEFINITIONS})
add_executable (${target} oneapi_passthrough.cpp)
target_link_libraries (${target} sycl pcl_oneapi_filters ${PCL_LIBRARIES})
6. Source the Intel® oneAPI Base Toolkit environment:
export PATH=/home/eiforamr/workspace/lib/pcl/share/pcl-1.12:/home/eiforamr/workspace/lib/pcl/share/pcl-oneapi-1.12:$PATH
source /opt/intel/oneapi/setvars.sh
7. Build the code:
cd /home/eiforamr/workspace/passthrough/
mkdir build && cd build
cmake ../
make -j
8. Download the test data from GitHub*:
wget https://raw.githubusercontent.com/PointCloudLibrary/data/5c26bdd0591ba150b91858b5c9fe5e91cb39ae86/tutorials/kinfu_large_scale/using_kinfu_large_scale_output.pcd
# if the binary is not downloaded try setting the proxies first and try again:
export http_proxy="http://<http_proxy>:port"
export https_proxy="http://<https_proxy>:port"
9. Run the binary:
./oneapi_passthrough
Expected results example:
[oneapi passthrough] PointCloud before filtering: 993419
[oneapi passthrough] PointCloud after filtering: 328598
Code Explanation
Load the test data from GitHub* into a PointCloud<PointXYZ>.
// GPU calculate
pcl::oneapi::PassThrough ps;
ps.setInputCloud(cloud_device);
ps.setFilterFieldName ("z");
ps.setFilterLimits (0.0, 1.0);
ps.filter(cloud_device_o);
Result (output log):
// print log
std::cout << "[oneapi passthrough] PointCloud before filtering: " << cloud_device.size() << std::endl;
std::cout << "[oneapi passthrough] PointCloud after filtering: " << cloud_device_o.size() << std::endl;
Prerequisites:
• Ubuntu 20.04 Desktop
• An 11th Generation Intel® Core™ microprocessor
export http_proxy="http://<http_proxy>:port"
export https_proxy="http://<https_proxy>:port"
2. Install the bundle:
unzip edge_insights_for_amr.zip
cd edge_insights_for_amr
chmod 775 edgesoftware
sudo groupadd docker
sudo usermod -aG docker $USER
newgrp docker
./edgesoftware install
Configure the Host System
1. After installing the bundle, open a new terminal and:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_PCL
./install.sh
sudo apt install vim -y
# logout, and close this terminal
exit
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_PCL
source /opt/intel/oneapi/setvars.sh
mkdir octree && cd octree
2. Create the test file:
vim oneapi_octree_search.cpp
3. Place the following inside the file:
#include <iostream>
#include <fstream>
#include <numeric>
#include <pcl/oneapi/octree/octree.hpp>
#include <pcl/oneapi/containers/device_array.h>
#include <pcl/point_cloud.h>
queries.resize(query_size);
radiuses.resize(query_size);
for (i = 0; i < query_size; ++i)
{
queries[i].x = ((float)rand())/(float)RAND_MAX * cube_size;
queries[i].y = ((float)rand())/(float)RAND_MAX * cube_size;
queries[i].z = ((float)rand())/(float)RAND_MAX * cube_size;
radiuses[i] = ((float)rand())/(float)RAND_MAX * max_radius;
};
indices.resize(query_size / 2);
for(i = 0; i < query_size / 2; ++i)
{
indices[i] = i * 2;
}
//oneAPI build
pcl::oneapi::Octree octree_device;
octree_device.setCloud(cloud_device);
octree_device.build();
//oneAPI octree radius search with shared radius using indices to specify
//the queries.
pcl::oneapi::Octree::Indices cloud_indices;
cloud_indices.upload(indices);
octree_device.radiusSearch(queries_device, cloud_indices, shared_radius, max_answers,
result_device3);
//oneAPI octree KNN search
//if neighbor points distances results are not required, can just call
//octree_device.nearestKSearchBatch(queries_device, k, result_device_knn)
octree_device.nearestKSearchBatch(queries_device, k, result_device_knn, dists_device_knn);
//Download results
std::vector<int> sizes1;
std::vector<int> sizes2;
std::vector<int> sizes3;
result_device1.sizes.download(sizes1);
result_device2.sizes.download(sizes2);
result_device3.sizes.download(sizes3);
int query_idx = 2;
std::cout << "Neighbors within shared radius search at ("
<< queries[query_idx].x << " "
<< queries[query_idx].y << " "
<< queries[query_idx].z << ") with radius=" << shared_radius << std::endl;
for (i = 0; i < sizes1[query_idx]; ++i)
{
std::cout << " " << points[downloaded_buffer1[max_answers * query_idx + i]].x
<< " " << points[downloaded_buffer1[max_answers * query_idx + i]].y
<< " " << points[downloaded_buffer1[max_answers * query_idx + i]].z
<< " (distance: " << dist(points[downloaded_buffer1[max_answers * query_idx + i]], queries[query_idx]) << ")" << std::endl;
}
vim CMakeLists.txt
5. Place the following inside the file:
include_directories(${PCL_INCLUDE_DIRS} ${PCL-ONEAPI_INCLUDE_DIRS})
link_directories(${PCL_LIBRARY_DIRS} ${PCL-ONEAPI_LIBRARY_DIRS})
add_definitions(${PCL_DEFINITIONS} ${PCL-ONEAPI_DEFINITIONS})
./oneapi_octree_search
NOTE Surface Reconstruction with Intel® oneAPI Base Toolkit's Moving Least Squares (MLS) uses the
pcl_viewer from the Docker* environment. Outside the Docker* environment, run pcl_viewer as:
pcl_viewer bun0-mls.pcd
Optimized PCL Known Limitation
JIT Limitation
Intel’s PCL optimization is implemented with the Intel® oneAPI DPC++ Compiler. The Intel® oneAPI DPC++
Compiler converts a DPC++ program into an intermediate language called SPIR-V (Standard Portable
Intermediate Representation). The SPIR-V code is stored in the binary produced by the compilation process.
The SPIR-V code has the advantage that it can be run on any hardware platform by translating the SPIR-V
code into the assembly code of the given platform at runtime. This process of translating the intermediate
code present in the binary is called Just-In-Time (JIT) compilation. Since JIT compilation happens at the
beginning of the execution of the first offloaded kernel, the performance is impacted. This issue can be
mitigated by setting the system environment variable to cache and reuse JIT-compiled binaries.
1. Set the system environment variable to cache and reuse JIT-compiled binaries.
export SYCL_CACHE_PERSISTENT=1
2. Set the environment variable permanently, for example by adding it to your shell profile:
echo 'export SYCL_CACHE_PERSISTENT=1' >> ~/.bashrc
NOTE To get an accurate PCL optimization performance number, this system environment variable
needs to be set, and the program needs to be executed once to generate and cache the JIT-compiled
binaries.
If the executable gives a segmentation fault (the core is dumped), the Docker* image was not opened with
the root user.
If you can see the GPU in the sycl-ls output, the user has the correct permissions:
sycl-ls
[opencl:0] ACC : Intel(R) FPGA Emulation Platform for OpenCL(TM) 1.2 [2021.13.11.0.23_160000]
[opencl:0] CPU : Intel(R) OpenCL 3.0 [2021.13.11.0.23_160000]
[opencl:0] GPU : Intel(R) OpenCL HD Graphics 3.0 [22.17.23034]
[level_zero:0] GPU : Intel(R) Level-Zero 1.3 [1.3.23034]
[host:0] HOST: SYCL host platform 1.2 [1.2]
If the user does not have the correct permissions, add the user to the render group:
#replace userName with the actual user of your system
sudo usermod -a -G render <userName>
Navigation
The following tutorials tell you how to use ROS 2 components developed by Intel to help an EI for AMR
robot navigate and map a room, and to provide teleop options.
Collaborative visual SLAM is compiled natively for both Intel® Core™ and Intel® Atom® processor-based
systems. By default, in this tutorial, the Intel® Core™ processor-based system version is used. If you are
running an Intel® Atom® processor-based system, you must make the changes detailed in Collaborative
Visual SLAM on Intel® Atom® Processor-Based Systems for collaborative visual SLAM to work.
• Collaborative Visual SLAM with Two Robots: uses as input two ROS 2 bags that simulate two robots
exploring the same area
• The ROS 2 tool rviz2 is used to visualize the two robots, the server, and how the server merges the
two local maps of the robots into one common map.
• The output includes the estimated pose of the camera and visualization of the internal map.
• All input and output are in standard ROS 2 formats.
• Collaborative Visual SLAM with FastMapping Enabled: uses as an input a ROS 2 bag that simulates a robot
exploring an area
• Collaborative visual SLAM has the FastMapping algorithm integrated.
• For more information on FastMapping, see How it Works.
• The ROS 2 tool rviz2 is used to visualize the robot exploring the area and how FastMapping creates the
2D and 3D maps.
• Collaborative Visual SLAM with GPU Offloading
• Offloading to the GPU only works on systems with 11th Generation Intel® Core™ processors with Intel®
Iris® Xe Integrated Graphics.
• Collaborative Visual SLAM also contains mapping and can operate in localization mode.
• Map an Area with the Wandering Application and UP Xtreme i11 Robotic Kit
• Start the UP Xtreme i11 Robotic Kit in Localization Mode
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends re-installing the EI for AMR Robot Kit with the Get
Started Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=17
5. If the bags were not extracted before, do it now:
6. Run the collaborative visual SLAM algorithm using two bags simulating two robots going through the
same area:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
2. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=17
3. If the bags were not extracted before, do it now:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/collab-slam-fastmapping.tutorial.yml up
Expected result: On the opened rviz2, you see the visual SLAM keypoints, the 3D map, and the 2D
map.
5. You can disable the /univloc_tracker_0/local_map, univloc_tracker_0/fused_map, or both
topics.
Visible Test: Showing keypoints, the 3D map, and the 2D map
Expected Result:
Visible Test: Showing the 3D map
Expected Result:
Visible Test: Map showing the 2D map
Expected Result:
Visible Test: Showing keypoints and the 2D map
Expected Result:
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends installing the Robot Complete Kit with the Get Started
Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=17
5. If the bags were not extracted before, do it now:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/collab-slam-gpu.tutorial.yml up
Expected result: On the opened rviz2, you see the visual SLAM keypoints, the 3D map, and the 2D map
7. On a different terminal, check how much of the GPU is used using intel-gpu-top.
8. To close this execution, close the rviz2 window, and press Ctrl-c in the terminal.
9. Clean up the Docker* images:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/collab-slam-gpu.tutorial.yml down --remove-orphans
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
gedit 01_docker_sdk_env/docker_compose/05_tutorials/<collab-slam-tutorial>.tutorial.yml
2. Replace this line:
source /home/eiforamr/workspace/ros_entrypoint.sh
With these lines:
unset CMAKE_PREFIX_PATH
unset AMENT_PREFIX_PATH
unset LD_LIBRARY_PATH
source /home/eiforamr/workspace/CollabSLAM/prebuilt_collab_slam_atom/setup.bash
Troubleshooting
• If the tracker (univloc_tracker_ros) fails to start, giving this error:
amr-collab-slam | [ERROR] [univloc_tracker_ros-2]: process has died [pid 140, exit code -4, cmd '/home/eiforamr/workspace/CollabSLAM/prebuilt_collab_slam_core/univloc_tracker/lib/univloc_tracker/univloc_tracker_ros --ros-args -r __node:=univloc_tracker_0 -r __ns:=/ --params-file /tmp/launch_params_zfr70odz -r /tf:=tf -r /tf_static:=tf_static -r /univloc_tracker_0/map:=map'].
See Collaborative Visual SLAM on Intel® Atom® Processor-Based Systems.
• The odometry feature use_odom:=true does not work with these bags.
The ROS 2 bags used in this example do not have the necessary topics recorded for the odometry feature
of collaborative visual SLAM.
If the use_odom:=true parameter is set, the collab-slam reports errors.
Make sure that your local user has read and write access to this path: <path to
edge_insights_for_amr>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/01_docker_sdk_env/docker_compose/06_bags
The best way to do this is to make your user the owner of the folder. If the EI for AMR bundle was
installed with sudo, chown the folder to your local user.
• If the following error is encountered:
gedit 01_docker_sdk_env/docker_compose/05_tutorials/collab-slam-gpu.tutorial.yml
# Change the value at line 26 from 109 to the number you got above.
• For general robot issues, go to: Troubleshooting for Robot Tutorials.
This tutorial tells you how to run a Kudan Visual SLAM (KdVisual) system using a ROS 2 bag as the input
containing data of a robot exploring an area.
• The ROS 2 tool rviz2 is used to visualize how KdVisual interprets the data from the ROS 2 bag.
• Find more information about Kudan Visual SLAM here.
• Find more information on Kudan in general here.
This may be in Step 1 or Step 3 depending on the use case you selected.
e. Click Next until you get to Download.
f. Click Download.
g. During installation, you are prompted to enter your product key, so copy the product key
displayed on the download page.
2. Copy edge_insights_for_amr.zip from the developer workstation to the Home directory on your
target system. You can use a USB flash drive to copy the file.
sudo su
echo 'export http_proxy="http://<http_proxy>:port"' >> /etc/environment
echo 'export https_proxy="http://<https_proxy>:port"' >> /etc/environment
echo 'export ftp_proxy="http://<ftp_proxy>:port"' >> /etc/environment
echo 'export no_proxy="<no_proxy>"' >> /etc/environment
exit
source /etc/environment
NOTE These steps are needed only once per host. They do not have to be done for different users or
different logins of the same user.
sudo visudo
# Add after other lines that add Defaults:
Defaults env_keep += "ftp_proxy http_proxy https_proxy no_proxy"
2. Run the following commands to go to the directory, change permission of the executable edgesoftware
file, and install the bundle:
cd edge_insights_for_amr
chmod 775 edgesoftware
sudo groupadd docker
sudo usermod -aG docker $USER
newgrp docker
./edgesoftware install
3. Type the product key at the prompt:
NOTE The product key is displayed on the download page. Contact Support Forum if it is not.
4. Based on components selected and system configuration, you might be prompted for additional actions.
For example, if your system is behind a proxy, you are asked to enter proxy settings.
When the installation is complete, you see an installation complete message and the installation status
for each module.
5. If any of the installed modules report a failure in the Status column due to a break in the internet
connection or for any other reason, run the install again:
./edgesoftware install
6. Set the correct ownership:
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends repeating the Download the Optional Kudan Visual SLAM
(KdVisual) Bundle step.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=17
# If the bags were not extracted before, do it now
unzip 01_docker_sdk_env/docker_compose/06_bags.zip -d 01_docker_sdk_env/docker_compose/
sudo chmod 0777 -R 01_docker_sdk_env/docker_compose/06_bags
5. Specify that the Docker* engine uses Release 2022.3.1 of the amr-kudan-slam Docker* image:
export DOCKER_TAG=2022.3.1
6. Run the Kudan Visual SLAM algorithm using a ROS 2 bag simulating a robot exploring an area:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/kudan-slam-cpu.tutorial.yml up
In rviz2, you can see what KdVisual does with the input from the ROS 2 bag.
7. To close this execution, close the rviz2 window, and press Ctrl-c in the terminal.
8. Clean up the Docker* images:
NOTE Offloading to the GPU only works on systems with 11th Generation Intel® Core™ processors with
Intel® Iris® Xe Integrated Graphics.
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/kudan-slam-gpu.tutorial.yml up
10. On a different terminal, check how much of the GPU is used using intel-gpu-top:
11. To close this execution, close the rviz2 window, and press Ctrl-c in the terminal.
12. Clean up the Docker* images:
Troubleshooting
If you encounter this error:
gedit 01_docker_sdk_env/docker_compose/05_tutorials/kudan-slam-gpu.tutorial.yml
# and/or
gedit 01_docker_sdk_env/docker_compose/05_tutorials/kudan-slam-cpu.tutorial.yml
# Change the value at line 26 from 109 to the number you got above.
If you encounter this error:
Failed to initialize SLAM, error 5 = The supplied license key is invalid or expired
Update the EI for AMR software to Release 2022.3.1; see the Get Started Guide for Robots for software
install instructions.
If the Kudan Visual SLAM tutorial does not start within a few seconds and you see the message “Building
kudan-slam,” the Docker* engine started building the incorrect version of the amr-kudan-slam image.
Specify that the Docker* engine must use Release 2022.3.1 of the amr-kudan-slam Docker* image:
export DOCKER_TAG=2022.3.1
If you are building your own Kudan Visual SLAM Docker* image outside the amr-kudan-slam Docker* image,
retrieve the updated license file (intel_amr.kdlicense) from the amr-kudan-slam:2022.3.1 Docker*
image:
FastMapping Algorithm
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends re-installing the EI for AMR Robot Kit with the Get
Started Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=17
# If the bags were not extracted before do it now
unzip 01_docker_sdk_env/docker_compose/06_bags.zip -d 01_docker_sdk_env/docker_compose/
sudo chmod 0777 -R 01_docker_sdk_env/docker_compose/06_bags
5. Run the FastMapping Algorithm using a bag of a robot spinning:
6. To close this, do one of the following:
• Type Ctrl-c in the terminal where you did the up command.
• Run this command in another terminal:
Troubleshooting
For general robot issues, go to: Troubleshooting for Robot Tutorials.
ADBSCAN Algorithm
This tutorial tells you how to run the ADBSCAN algorithm from EI for AMR using 2D Slamtec* RPLIDAR and
Intel® RealSense™ camera input.
It outputs to the obstacle_array topic of type nav2_dynamic_msgs/ObstacleArray.
Prerequisites: You know how to connect and configure a Slamtec* RPLIDAR sensor. For details, see: 2D
LIDAR and ROS 2 Cartographer.
NOTE If one or both of the images are not installed, continuing with these steps triggers a build
that takes longer than an hour (sometimes, a lot longer depending on the system resources and
internet connection).
2. If one or both of the images are not installed, Intel recommends installing the Robot Base Kit or the
Robot Complete Kit with the Get Started Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=17
# Unzip the ros2 bags if they were not unzipped before
unzip 01_docker_sdk_env/docker_compose/06_bags.zip -d 01_docker_sdk_env/docker_compose/
sudo chmod 0777 -R 01_docker_sdk_env/docker_compose/06_bags
5. Depending on the Slamtec* RPLIDAR availability, you have two possibilities:
• Slamtec* RPLIDAR connected
1. Verify the udev rules that you configured for RPLIDAR in 2D LIDAR and ROS 2 Cartographer.
a. Get the Slamtec* RPLIDAR serial port:
b. Check for similar logs:
export RPLIDAR_SERIAL_PORT=/dev/ttyUSB0
# this value may differ from system to system, use the value returned in the previous step
2. Start a pre-configured yml file that starts the LIDAR Node and then the ADBSCAN application:
Run the ADBSCAN Algorithm with Intel® RealSense™ Camera Input
1. Check if your installation has the amr-adbscan and amr-realsense Docker* images.
NOTE If one or both of the images are not installed, continuing with these steps triggers a build
that takes longer than an hour (sometimes, a lot longer depending on the system resources and
internet connection).
2. If one or both of the images are not installed, Intel recommends installing the Robot Base Kit or Robot
Complete Kit with the Get Started Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=17
# Unzip the ros2 bags if they were not unzipped before
unzip 01_docker_sdk_env/docker_compose/06_bags.zip -d 01_docker_sdk_env/docker_compose/
sudo chmod 0777 -R 01_docker_sdk_env/docker_compose/06_bags
5. Depending on the Intel® RealSense™ camera availability, you have two possibilities:
• Intel® RealSense™ camera connected
Start a pre-configured yml file that starts the Intel® RealSense™ node and then the ADBSCAN
application:
Troubleshooting
For general robot issues, go to: Troubleshooting for Robot Tutorials.
The ITS plugin for the ROS 2 Navigation 2 application is a global path planner module based on
Intelligent sampling and Two-way Search (ITS). It does not support continuous replanning.
Prerequisites: Use a simple behavior tree with a compute path to pose and a follow path.
ITS planner inputs:
• global 2D costmap (nav2_costmap_2d::Costmap2D)
• start and goal pose (geometry_msgs::msg::PoseStamped)
ITS planner outputs: 2D waypoints of the path
Path planning steps summary:
1. The ITS planner converts the 2D costmap to either a Probabilistic Road Map (PRM) or a Deterministic
Road Map (DRM).
2. The generated roadmap is saved as a .txt file, which can be reused for multiple queries.
3. The ITS planner conducts a two-way search to find a path from the source to the destination. Either the
smoothing filter or a catmull spline interpolation can be used to create a smooth and continuous path.
The generated smooth path is in the form of a ROS 2 navigation message type (nav_msgs::msg).
For customization options, see ITS Path Planner Plugin Customization.
Run the ROS 2 Navigation Sample Application Using ITS Path Planner
1. Check if your installation has the eiforamr-full-flavour-sdk Docker* image.
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends installing the Robot Complete Kit with the Get Started
Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
5. Start the ROS 2 navigation sample application using the TurtleBot* 3 Gazebo* simulation:
NOTE The above command opens Gazebo* and rviz2 applications. Gazebo* takes a longer time to
open (up to a minute) depending on the host’s capabilities. Both applications contain the simulated
waffle map, and a simulated robot. Initially, the applications are opened in the background, but you
can bring them into the foreground, side-by-side, for a better visual.
b. In rviz2, press Navigation2 Goal, and choose a destination for the robot. This calls the
behavioral tree navigator to go to that goal through an action server.
Expected result: The robot moves along the path generated to its new destination.
c. Set new destinations for the robot, one at a time.
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/its_path_planner.tutorial.yml down
Troubleshooting
For general robot issues, go to Troubleshooting for Robot Tutorials.
The ROS 2 navigation bringup application is started using the TurtleBot* 3 Gazebo* simulation, and it
receives as input parameter its_nav2_params.yaml.
Check the code snippet from 01_docker_sdk_env/docker_compose/05_tutorials/its_path_planner.tutorial.yml:
export TURTLEBOT3_MODEL=waffle
export GAZEBO_MODEL_PATH=$GAZEBO_MODEL_PATH:/home/eiforamr/ros2_ws/install/turtlebot3_gazebo/share/turtlebot3_gazebo/models/
ros2 launch nav2_bringup tb3_simulation_launch.py params_file:=${CONTAINER_BASE_PATH}/01_docker_sdk_env/docker_compose/05_tutorials/param/its_nav2_params.yaml
To use the ITS path planner plugin, the following parameters are added in its_nav2_params.yaml:
planner_server:
ros__parameters:
expected_planner_frequency: 0.01
use_sim_time: True
planner_plugins: ["GridBased"]
GridBased:
plugin: "its_planner/ITSPlanner"
interpolation_resolution: 0.05
catmull_spline: False
smoothing_window: 15
buffer_size: 10
build_road_map_once: True
min_samples: 250
roadmap: "PROBABLISTIC"
w: 32
h: 32
n: 2
smoothing_window:
The window size for the smoothing filter (The unit is the grid size.)
buffer_size:
During roadmap generation, the samples are generated away from obstacles. The buffer size dictates how far
away from obstacles the roadmap samples should be.
build_road_map_once:
If true, the roadmap is loaded from the saved file; otherwise, a new roadmap is generated.
min_samples:
The minimum number of samples required to generate the roadmap
roadmap:
Either PROBABILISTIC or DETERMINISTIC
w:
The width of the window for intelligent sampling
h:
The height of the window for intelligent sampling
n:
The minimum number of samples that is required in an area defined by w and h
You can modify these values by editing the file below, at lines 251-267:
01_docker_sdk_env/docker_compose/05_tutorials/param/its_nav2_params.yaml
This tutorial tells you how to use the DPC++ compiler, convert CUDA to DPC++, build it, and run it in a
Docker* container.
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends installing the Robot Complete Kit with the Get Started
Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/
AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
5. Run the command below to start the Docker* container as root:
sudo apt-key adv --keyserver-options http-proxy=<http_proxy:port> --keyserver
keyserver.ubuntu.com --recv-key 204DD8AEC33A7AFF
sudo -E apt-get update -y --allow-unauthenticated && DEBIAN_FRONTEND=noninteractive sudo -E apt-
get install -y --no-install-recommends system76-cuda-10.1
7. If the install command fails, check whether CUDA was installed:
ls /usr/lib/cuda*
Example output:
/usr/lib/cuda/:
EULA.txt NsightSystems-2018.3 cublas_version.txt extras jre libnsight
nsightee_plugins nvvm share targets version.txt
NsightCompute-2019.1 bin doc include lib64 libnvvp
nvml samples src tools
/usr/lib/cuda-10.1/:
EULA.txt NsightSystems-2018.3 cublas_version.txt extras jre libnsight
nsightee_plugins nvvm share targets version.txt
NsightCompute-2019.1 bin doc include lib64 libnvvp
nvml samples src tools
8. Set up the environment for Intel® oneAPI Base Toolkit:
source /opt/intel/oneapi/setvars.sh
9. Download a sample file that uses the DPC++ compiler:
wget -L https://raw.githubusercontent.com/intel/llvm/98b6ee437ed325992ace95548b0ffc01dd4cbbe9/
sycl/examples/simple-dpcpp-app.cpp -O simple.cpp
# If the file does not download, set the proxies first, and try again:
export http_proxy="http://<http_proxy>:port"
export https_proxy="http://<https_proxy>:port"
Run the command below and review the output binary:
root@edgesoftware:/home/eiforamr/data_samples#
b. Conversion successfully done:
ls
dpct_output vector_add.cu
c. Go to output directory:
cd dpct_output
d. Create a simple Makefile with this content:
CXX = dpcpp
TARGET = vector_add
SRCS = vector_add.dp.cpp
# Use predefined implicit rules and add one for *.cpp files.
%.o: %.cpp
$(CXX) -c $(CXXFLAGS) $(CPPFLAGS) $< -o $@
all: $(TARGET)
run: $(TARGET)
./$(TARGET)
.PHONY: clean
clean:
rm -f $(TARGET) *.o
e. Run make and then the output binary named vector_add:
make
./vector_add
Expected output:
A block of even numbers is listed, indicating the result of adding two vectors: [1..N] + [1..N].
./vector_add
2 4 6 8 10 12 14 16 18 20 22 24 26 28 30 32
34 36 38 40 42 44 46 48 50 52 54 56 58 60 62 64
66 68 70 72 74 76 78 80 82 84 86 88 90 92 94 96
98 100 102 104 106 108 110 112 114 116 118 120 122 124 126 128
130 132 134 136 138 140 142 144 146 148 150 152 154 156 158 160
162 164 166 168 170 172 174 176 178 180 182 184 186 188 190 192
194 196 198 200 202 204 206 208 210 212 214 216 218 220 222 224
226 228 230 232 234 236 238 240 242 244 246 248 250 252 254 256
258 260 262 264 266 268 270 272 274 276 278 280 282 284 286 288
290 292 294 296 298 300 302 304 306 308 310 312 314 316 318 320
322 324 326 328 330 332 334 336 338 340 342 344 346 348 350 352
354 356 358 360 362 364 366 368 370 372 374 376 378 380 382 384
386 388 390 392 394 396 398 400 402 404 406 408 410 412 414 416
418 420 422 424 426 428 430 432 434 436 438 440 442 444 446 448
450 452 454 456 458 460 462 464 466 468 470 472 474 476 478 480
482 484 486 488 490 492 494 496 498 500 502 504 506 508 510 512
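The listed values match the stated definition: with N = 256, element i of [1..N] + [1..N] is 2*i, so the output runs from 2 to 512 in steps of 2. A quick Python check of that arithmetic (not part of the DPC++ sample):

```python
# vector_add adds [1..N] to itself, so element i of the result is 2 * i.
N = 256
result = [i + i for i in range(1, N + 1)]

assert result[0] == 2 and result[-1] == 512  # first and last printed values
assert result == [2 * i for i in range(1, N + 1)]
```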
Troubleshooting
The Makefile from step 9.d contains tabs and may not copy well to your system, giving this error:
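make requires each recipe line to begin with a tab character, and a paste that converts tabs to spaces breaks the Makefile. A short Python sketch that flags space-indented recipe lines (the helper name is hypothetical):

```python
def find_space_indented_lines(makefile_text):
    """Return 1-based numbers of non-empty lines indented with spaces,
    which make rejects where it expects a tab-indented recipe."""
    return [num for num, line in enumerate(makefile_text.splitlines(), start=1)
            if line.startswith(" ") and line.strip()]

good = "all: vector_add\n\tdpcpp vector_add.dp.cpp -o vector_add\n"
pasted = "all: vector_add\n    dpcpp vector_add.dp.cpp -o vector_add\n"
assert find_space_indented_lines(good) == []
assert find_space_indented_lines(pasted) == [2]
```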
Hardware Prerequisites
You have a robot and a gamepad.
This example uses a Logitech* F710 wireless gamepad and the UP Xtreme i11 Robotic Kit.
1. Add ros-base-teleop to your robot’s yml file.
EI for AMR contains a yml file with ros-base-teleop configured:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
gedit 01_docker_sdk_env/docker_compose/05_tutorials/
aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2_ukf.tutorial.yml
Copy the lines for ros-base-teleop into your generic-robot yml file:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
meld 01_docker_sdk_env/docker_compose/05_tutorials/
aaeon_wandering__aaeon_realsense_collab_slam_fm_nav2_ukf.tutorial.yml 01_docker_sdk_env/
docker_compose/05_tutorials/
generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml
# replace generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2.tutorial.yml with your yml
# file
2. Insert the USB dongle in the robot.
3. Place the robot in an area with multiple objects in it.
4. Go to the installation folder of Edge_Insights_for_Autonomous_Mobile_Robots, and prepare the
environment:
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=27
sudo chmod a+rw /dev/input/js0
sudo chmod a+rw /dev/input/event*
5. Start mapping the area:
docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/
generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2_ukf.tutorial.yml up
# replace generic_robot_wandering__aaeon_realsense_collab_slam_fm_nav2_ukf.tutorial.yml with your
# yml file
Expected result: The robot starts wandering around the room and mapping the entire area.
6. After the robot starts to move, you can control it with the gamepad:
• To control the robot, keep the top right button labeled 1 in the picture below pressed at all times.
• To move the robot on the X and Y axes, enable the mode button, and use the buttons labeled 2 in
the picture below.
• To rotate the robot in place, use the joystick labeled 3 in the picture below.
• To move the robot on the X and Y axes, disable the mode button, and use the joystick labeled 4 in
the picture below.
Hardware Prerequisites
You have a robot and a keyboard or an ssh/vnc connection to the robot.
This example uses the UP Xtreme i11 Robotic Kit.
1. Connect to your robot via ssh/vnc or direct access. If you choose direct access, insert a monitor and a
keyboard into the robot’s compute system.
2. Start your robot’s node, and make sure that you have the correct remapping similar to this:
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
chmod a+x run_interactive_docker.sh
./run_interactive_docker.sh amr-aaeon-amr-interface:2022.3 eiforamr -c aaeon_node
source /home/eiforamr/workspace/ros_entrypoint.sh
export ROS_DOMAIN_ID=167
ros2 run ros2_amr_interface amr_interface_node --ros-args -p try_reconnect:=true -p
timeout_connection:=1000.0 -p publishTF:=true --remap /amr/cmd_vel:=/cmd_vel -p port_name:=/dev/
ttyUSB0
3. In another terminal, open full-sdk, and start teleop_twist_keyboard:
NOTE The full-sdk docker image is only present in the Robot Complete Kit, not in the Robot Base Kit
or UP Xtreme i11 Robotic Kit.
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/01_amr/amr-sdk.all.yml
run full-sdk bash
source /home/eiforamr/workspace/ros_entrypoint.sh
export ROS_DOMAIN_ID=167
ros2 run teleop_twist_keyboard teleop_twist_keyboard
Expected result: The robot responds to your keyboard commands in these ways:
• i: Move forward
• k: Stop
• ,: Move backward
• j: Turn left
• l: Turn right
• q/z: Increase/decrease max speeds by 10%
• w/x: Increase/decrease only linear speed by 10%
• e/c: Increase/decrease only angular speed by 10%
• L or J (only for omnidirectional robots): Strafe (move sideways)
• anything else: Stop
• Ctrl-c: Quit
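Note that the 10% speed adjustments are multiplicative, so an increase followed by a decrease does not return exactly to the starting speed. A minimal Python sketch of that update rule (the factors are inferred from the listed percentages):

```python
def adjust(speed, increase):
    """Scale a speed up or down by 10%, as the q/z, w/x, and e/c keys do."""
    return speed * (1.1 if increase else 0.9)

linear = 0.5
linear = adjust(linear, increase=True)   # +10% -> 0.55
assert abs(linear - 0.55) < 1e-9
linear = adjust(linear, increase=False)  # -10% -> 0.495, not back to 0.5
assert abs(linear - 0.495) < 1e-9
```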
Simulation
The following tutorials tell you how to use ROS 2 simulations inside Docker* containers. Robot sensing and
navigation can be tested in these simulated environments.
turtlesim
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends installing the Robot Complete Kit (see Get Started Guide
for Robots Step 3).
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=12
NOTE The “Access control disabled, clients cannot connect from any host.” message is expected and
does not impact functionality.
5. Run an automated yml file that opens a ROS 2 sample application inside the EI for AMR Docker*
container.
CHOOSE_USER=eiforamr docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/
turtlesim.tutorial.yml up
6. Go to Plugins > Services > Service Caller. To move turtle1, choose /turtle1/teleport_absolute
from the Service drop-down list, and change the x and y coordinates from their original values.
Press Call. The turtle moves. Close the Service Caller window by pressing x, and then press
Ctrl-c.
This tutorial tells you how to use an industrial simulation room model (the OSRF GEAR workcell that was
used for the 2018 ARIAC competition) with objects and a wafflebot3 robot for simulation in Gazebo*. The
industrial room includes: shelves, conveyor belts, pallets, boxes, robots, stairs, ground lane markers, and a
tiled boundary wall.
NOTE If one or more of the images are not installed, continuing with these steps triggers a build
that takes longer than an hour (sometimes, a lot longer depending on the system resources and
internet connection).
3. If one or more of the images are not installed, Intel recommends installing the Robot Complete Kit with
the Get Started Guide for Robots.
4. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/
AMR_containers
5. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=32
6. Run the command below to start the Docker* container:
Troubleshooting
If the robot is not moving but Gazebo* is started, start the Wandering application manually by opening a
container shell and entering:
The ARIAC world tutorial works only with the eiforamr user. If the yml is started with a different user, the
Gazebo* model fails.
This tutorial shows a TurtleBot* 3 simulated in a waffle world. For more information about TurtleBot* 3 and
the waffle world, see this.
NOTE If one or more of the images are not installed, continuing with these steps triggers a build
that takes longer than an hour (sometimes, a lot longer depending on the system resources and
internet connection).
3. If one or more of the images are not installed, Intel recommends installing the Robot Complete Kit with
the Get Started Guide for Robots.
4. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/
AMR_containers
5. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=32
6. Run this command to start the Docker* container:
Troubleshooting
If the robot is not moving but Gazebo* has started, start the wandering application manually by opening a
container shell and entering:
Run the profiling application in a Docker* container with the VTune™ profiler.
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends installing the Robot Complete Kit with the Get Started
Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/
AMR_containers
4. Prepare the environment setup:
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=19
5. Run the VTune™ profiler:
Expected output:
vtune: Warning: To profile kernel modules during the session, make sure they are available in
the /lib/modules/kernel_version/ location.
vtune: Collection started. To stop the collection, either press CTRL-C or enter from another
console window: vtune -r /tmp/matrix_multiply_vtune/r001gh -command stop.
Address of buf1 = 0x7f4578e4b010
Offset of buf1 = 0x7f4578e4b180
Address of buf2 = 0x7f457864a010
Offset of buf2 = 0x7f457864a1c0
Address of buf3 = 0x7f45746e2010
Offset of buf3 = 0x7f45746e2100
Address of buf4 = 0x7f4573ee1010
Offset of buf4 = 0x7f4573ee1140
Using multiply kernel: multiply1
Running on Intel(R) Iris(R) Xe Graphics [0x9a49]
Elapsed Time: 0.91916s
vtune: Collection stopped.
vtune: Using result path `/tmp/matrix_multiply_vtune/r001gh'
vtune: Executing actions 19 % Resolving information for `libpi_opencl.so'
vtune: Warning: Cannot locate debugging information for file `/usr/local/lib/
libze_intel_gpu.so.1'.
vtune: Executing actions 20 % Resolving information for `libc-dynamic.so'
vtune: Warning: Cannot locate debugging information for file `/lib/modules/5.10.65/kernel/fs/
overlayfs/overlay.ko'.
vtune: Executing actions 20 % Resolving information for `libm-2.31.so'
vtune: Warning: Cannot locate debugging information for file `/usr/lib/x86_64-linux-gnu/
libm-2.31.so'.
vtune: Executing actions 20 % Resolving information for `libc-2.31.so'
vtune: Warning: Cannot locate debugging information for file `/usr/lib/x86_64-linux-gnu/
libc-2.31.so'.
vtune: Executing actions 20 % Resolving information for `ld-2.31.so'
vtune: Warning: Cannot locate debugging information for file `/usr/lib/x86_64-linux-gnu/
ld-2.31.so'.
vtune: Warning: Cannot locate file `vmlinux'.
vtune: Executing actions 20 % Resolving information for `libpin3dwarf.so'
vtune: Warning: Cannot locate debugging information for file `/usr/local/lib/libigc.so.1.0.8517'.
vtune: Executing actions 20 % Resolving information for `libxed.so'
vtune: Warning: Cannot locate debugging information for the Linux kernel. Source-level analysis
is not possible. Function-level analysis is limited to kernel symbol tables. See the Enabling
Linux Kernel Analysis topic in the product online help for instructions.
vtune: Executing actions 21 % Resolving information for `libgcc_s.so.1'
vtune: Warning: Cannot locate debugging information for file `/usr/lib/x86_64-linux-gnu/
libgcc_s.so.1'.
vtune: Executing actions 21 % Resolving information for `libstdc++.so.6.0.28'
vtune: Warning: Cannot locate debugging information for file `/usr/lib/x86_64-linux-gnu/libstdc+
+.so.6.0.28'.
vtune: Executing actions 21 % Resolving information for `libtpsstool.so'
vtune: Warning: Cannot locate debugging information for file `/opt/intel/oneapi/vtune/2022.0.0/
lib64/libtpsstool.so'.
vtune: Executing actions 21 % Resolving information for `i915.ko'
vtune: Warning: Cannot locate debugging information for file `/opt/intel/oneapi/vtune/2022.0.0/
lib64/runtime/libittnotify_collector.so'.
vtune: Warning: Cannot locate debugging information for file `/opt/intel/oneapi/vtune/2022.0.0/
lib64/runtime/libittnotify_collector.so'.
vtune: Executing actions 22 % Resolving information for `libOpenCL.so.1'
vtune: Warning: Cannot locate debugging information for file `/usr/local/lib/
libze_intel_gpu.so.1.2.20939'.
vtune: Executing actions 22 % Resolving information for `libigdrcl.so'
SVM Capabilities
Troubleshooting
For general robot issues, go to: Troubleshooting for Robot Tutorials.
This tutorial tells you how to run the benchmark application on an 11th Generation Intel® Core™ processor
with an integrated GPU. It uses the asynchronous mode to estimate deep learning inference engine
performance and latency.
NOTE If the image is not installed, continuing with these steps triggers a build that takes longer
than an hour (sometimes, a lot longer depending on the system resources and internet connection).
2. If the image is not installed, Intel recommends installing the Robot Complete Kit with the Get Started
Guide for Robots.
3. Go to the AMR_containers folder:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/
AMR_containers
4. Start the Docker* container as root:
source /opt/intel/openvino/bin/setupvars.sh
--or--
source <OPENVINO_INSTALL_DIR>/bin/setupvars.sh
cd /opt/intel/openvino/inference_engine/samples/cpp
./build_samples.sh
2. Once the build is successful, access the benchmark application in the following directory:
cd /root/inference_engine_cpp_samples_build/intel64/Release
-- or --
cd <INSTALL_DIR>/inference_engine_cpp_samples_build/intel64/Release
The benchmark_app application is available inside the Release folder.
Input File
Select an image file or a sample video file to provide an input to the benchmark application from the
following directory:
cd /root/inference_engine_cpp_samples_build/intel64/Release
./benchmark_app [OPTION]
In this tutorial, we recommend you select the following options:
where:
<model>-------------The complete path to the model .xml file
<input>-------------The path to the folder containing the image or sample video file
<device>------------The device type, such as GPU or CPU
<num_reqs>----------The number of parallel inference requests
<num_threads>-------The number of threads to use for inference on the CPU (throughput mode)
<batch>-------------The batch size
For complete details on the available options, run the following command:
./benchmark_app -h
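As a rough model of what asynchronous mode reports, throughput is the number of processed frames over wall-clock time; this Python sketch shows the arithmetic only (benchmark_app's actual accounting is more detailed):

```python
def throughput_fps(iterations, batch, elapsed_seconds):
    """Frames per second: total processed frames over wall-clock time."""
    return iterations * batch / elapsed_seconds

# e.g. 1000 asynchronous iterations of batch size 4 finishing in 20 s
assert throughput_fps(1000, 4, 20.0) == 200.0
```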
Build .................. 2021.2.0-1877-176bdf51370-releases/2021/2
Description ....... API
[ INFO ] Device info:
GPU
clDNNPlugin version ......... 2.1
Build ........... 2021.2.0-1877-176bdf51370-releases/2021/2
Benchmark Report
Sample execution results using an 11th Gen Intel® Core™ i7-1185GRE @ 2.80 GHz.
NOTE Performance results are based on testing as of dates shown in configurations and may not
reflect all publicly available updates. No product or component can be absolutely secure. Performance
varies by use, configuration and other factors. Learn more at Intel® Performance Index.
Troubleshooting
For general robot issues, go to: Troubleshooting for Robot Tutorials.
Run the Edge Insights for Autonomous Mobile Robots container on a KVM guest.
NOTE If the output is not greater than zero, reboot your server. Modify BIOS Settings > Security
features > Enable Virtualization Technology.
2. Verify that Kernel-based Virtual Machine (KVM) acceleration can be used with the commands:
NOTE
1. If the PAM configuration page is displayed when installing the following packages, click Yes.
2. If Configuring openssh-server page is displayed when installing the following packages, select
keep the local version currently installed.
vim br_kvm.xml
b. Add bridge details to br_kvm.xml:
<network>
<name>br_kvm</name>
<forward mode='nat'>
<nat>
<port start='1024' end='65535'/>
</nat>
</forward>
<bridge name='br10' stp='on' delay='0'/>
<ip address='192.168.124.1' netmask='255.255.255.0'>
<dhcp>
<range start='192.168.124.50' end='192.168.124.200'/>
</dhcp>
</ip>
</network>
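Before defining the network, you can sanity-check the addressing in br_kvm.xml with Python's ipaddress module; this is an optional check, not a step required by the tutorial:

```python
import ipaddress

# The bridge subnet, gateway, and DHCP range from br_kvm.xml
net = ipaddress.ip_network("192.168.124.0/24")   # netmask 255.255.255.0
gateway = ipaddress.ip_address("192.168.124.1")
start = ipaddress.ip_address("192.168.124.50")
end = ipaddress.ip_address("192.168.124.200")

assert gateway in net and start in net and end in net
assert int(end) - int(start) + 1 == 151  # size of the DHCP pool
```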
c. Define and start br_kvm network using the following commands:
c. Select the CPUs and Memory. For example: Memory: 4096, CPUs: 2
d. Add minimum 100 GB for the virtual machine storage.
8. Install Edge Insights for Autonomous Mobile Robots using the steps in Get Started Guide for Robots.
NOTE If your system is behind a proxy, you must configure the proxy settings.
9. Run the Intel® RealSense™ ROS 2 sample application inside the Docker* container using the steps from
Intel® RealSense™ ROS 2 Sample Application.
Troubleshooting
If the following error is encountered:
This tutorial covers how to build, install, and run a Wireless Wide Area Network (WWAN) 5G private
network with Linux* components on Ubuntu* for Intel® IoT platforms.
Prerequisites:
• Fibocom’s FM350 5G module installed on the compute system
• A 5G private network infrastructure
• The APN of the 5G private network
sudo apt-get -y install build-essential gcc bc bison flex libssl-dev libncurses5-dev libelf-dev
dwarves zstd
make olddefconfig
scripts/config --set-str SYSTEM_TRUSTED_KEYS ""
scripts/config --set-str CONFIG_SYSTEM_REVOCATION_KEYS ""
scripts/config --enable WWAN
scripts/config --module CONFIG_MTK_T7XX
6. Compile the kernel, and create the Debian* kernel packages:
NOTE
cd /tmp
sudo update-grub
9. Reboot your system.
sync
sudo reboot -fn
10. Check your kernel version after reboot.
uname -r
cd Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
sudo chmod 775 ./01_docker_sdk_env/artifacts/01_amr/amr_5G_wwan/wwan_module_install.sh
2. If you already have it cloned, remove the focal/ folder:
rm -rf focal/
3. Run the kernel module install script:
./01_docker_sdk_env/artifacts/01_amr/amr_5G_wwan/wwan_module_install.sh
4. Follow the on-screen instructions to fetch the kernel source files and build and load the kernel modules.
(The patches are self-healing, so errors that appear in the patches are fixed by subsequent patches.)
5. Prepare the environment setup:
source 01_docker_sdk_env/docker_compose/common/docker_compose.source
If you get a “some variables not defined” error message, see Troubleshooting.
6. Run the WWAN 5G module container:
./network_init.sh
9. Follow the on-screen instructions to enter the APN and IP route of your 5G private network.
10. Test the 5G network connection by pinging the IP address of the server.
11. Disconnect the 5G network connection and disable the 5G module from within the WWAN 5G module
container.
a. Obtain the 5G module index number:
mmcli -L
b. Disconnect the 5G network connection and disable the 5G module (this example uses 0 for the 5G
module index number):
mmcli -m 0 --simple-disconnect
mmcli -m 0 -d
NOTE If you only stop the WWAN 5G module container, the 5G network is still available because the
5G module is still enabled and has an active connection.
Troubleshooting
• If you see an error message that docker-compose fails with some variables not defined, add the
environment variables to .bashrc so that they are available to all terminals:
export DOCKER_BUILDKIT=1
export COMPOSE_DOCKER_CLI_BUILD=1
export DOCKER_HOSTNAME=$(hostname)
export DOCKER_USER_ID=$(id -u)
export DOCKER_GROUP_ID=$(id -g)
export DOCKER_USER=$(whoami)
# Check with command
env | grep DOCKER
• During execution of wwan_module_install.sh, the following errors may be encountered:
Change Existing and Add New Docker* Images to the EI for AMR SDK
NOTE Building should be done on a development machine with at least 16 GB of RAM. Building
multiple Docker* images, in parallel, on a system with 8 GB of RAM may end in a crash due to lack of
system resources.
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
01_docker_sdk_env/
There are two main Docker* files here:
• dockerfile.amr: For images to be deployed on robots
• dockerfile.edge-server: For images to be deployed on the server
These Docker* files create multiple Docker* images. The Docker* images to be created are defined in
the yaml configuration file, which is in the docker_compose folder.
Also, these Docker* files include many sub-files in the docker_stages folder. Each Docker* stage
represents a specific component that must be included in one of the Docker* images.
cd docker_stages
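The INCLUDE+ lines inside these Docker* files act as a simple textual include mechanism: each directive is replaced by the contents of the referenced stage file when the final Docker* file is assembled. A minimal Python sketch of that expansion (the resolver and its in-memory files are hypothetical, not the SDK's actual tooling):

```python
def expand_includes(text, files):
    """Replace each 'INCLUDE+ <path>' line with the contents of that file,
    expanding nested INCLUDE+ directives recursively."""
    out = []
    for line in text.splitlines():
        if line.startswith("INCLUDE+ "):
            path = line.split(" ", 1)[1].strip()
            out.extend(expand_includes(files[path], files).splitlines())
        else:
            out.append(line)
    return "\n".join(out)

files = {"docker_stages/01_amr/dockerfile.stage.imaginary": "RUN echo imaginary"}
dockerfile = "FROM ros-base\nINCLUDE+ docker_stages/01_amr/dockerfile.stage.imaginary"
assert expand_includes(dockerfile, files) == "FROM ros-base\nRUN echo imaginary"
```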
3. Open the Docker* file from the environment folder in your preferred integrated development
environment (IDE), and append component-specific installation instructions in the appropriate place.
The following is an example of appending the Gazebo* application in dockerfile.stage.realsense.
Give appropriate permissions:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_202*/AMR_containers
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
docker-compose -f 01_docker_sdk_env/docker_compose/01_amr/amr-sdk.all.yml build realsense
NOTE Building should be done on a development machine with at least 16 GB of RAM. Building
multiple Docker* images, in parallel, on a system with 8 GB of RAM may end in a crash due to lack of
system resources.
Building on the People’s Republic of China (PRC) network may result in multiple issues. See the
Troubleshooting section for more details.
Create New Docker* Images with Selected Applications from the SDK
In this tutorial, you install an imaginary component as a new image and add it to the existing ros2-foxy-sdk image.
1. Add a new file called docker_stages/01_amr/dockerfile.stage.imaginary.
Add instructions to install this component into this file using basic Docker* file syntax:
# Note: below repo does not exist, it is for demonstration purposes only
WORKDIR ${ROS2_WS}
RUN cd src \
&& git clone --branch ros2 https://github.com/imaginary.git \
&& cd imaginary && git checkout <commit_id> \
&& source ${ROS_INSTALL_DIR}/setup.bash \
&& colcon build --install-base ${ROS2_WS}/install \
&& rm -rf ${ROS2_WS}/build/* ${ROS2_WS}/src/* ${ROS2_WS}/log/*
INCLUDE+ docker_stages/01_amr/dockerfile.stage.vda5050
INCLUDE+ docker_stages/01_amr/dockerfile.stage.imaginary
INCLUDE+ docker_stages/01_amr/dockerfile.stage.opencv
INCLUDE+ docker_stages/01_amr/dockerfile.stage.rtabmap
INCLUDE+ docker_stages/01_amr/dockerfile.stage.fastmapping
INCLUDE+ docker_stages/01_amr/dockerfile.stage.gazebo
INCLUDE+ docker_stages/01_amr/dockerfile.stage.gstreamer
INCLUDE+ docker_stages/01_amr/dockerfile.stage.kobuki
INCLUDE+ docker_stages/01_amr/dockerfile.stage.nav2
INCLUDE+ docker_stages/01_amr/dockerfile.stage.realsense
INCLUDE+ docker_stages/01_amr/dockerfile.stage.ros-arduino
INCLUDE+ docker_stages/01_amr/dockerfile.stage.ros1-bridge
INCLUDE+ docker_stages/01_amr/dockerfile.stage.rplidar
INCLUDE+ docker_stages/01_amr/dockerfile.stage.turtlebot3
INCLUDE+ docker_stages/01_amr/dockerfile.stage.turtlesim
# simulations has a hard dependency on nav2 (@todo:), so we cannot create a separate image for simulations without nav2.
INCLUDE+ docker_stages/01_amr/dockerfile.stage.simulations
INCLUDE+ docker_stages/01_amr/dockerfile.stage.entrypoint
################################# ros2-foxy-SDK stage END #######################################
4. Define a new target in the docker_compose/01_amr/amr-sdk.all.yml or docker_compose/
01_amr/edge-server.all.yml file:
imaginary:
image: ${REPO_URL}amr-ubuntu2004-ros2-foxy-imaginary:${DOCKER_TAG:-latest}
container_name: ${CONTAINER_NAME_PREFIX:-amr-sdk-}imaginary
extends:
file: ./amr-sdk.all.yml
service: ros-base
build:
target: imaginary
network_mode: host
command: ['echo imaginary run finished.']
5. Build two Docker* images:
• amr-ubuntu2004-ros2-foxy-imaginary
• amr-ubuntu2004-ros2-foxy-sdk
These images contain the new imaginary component.
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_202*/AMR_containers
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
docker-compose -f 01_docker_sdk_env/docker_compose/01_amr/amr-sdk.all.yml build imaginary ros2-
foxy-sdk
6. To see the details of the built image:
Troubleshooting
1. Building on the People’s Republic of China (PRC) Open Network.
Building Docker* images on the People’s Republic of China (PRC) open network may fail. Intel
recommends updating these links with their corresponding PRC mirrors. To do this, go to the
AMR_containers folder, and update the broken sites with the default or user-defined mirrors.
cd Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
chmod 775 changesources.sh
./changesources.sh -d .
Enter mirror server ('https://example.com' format) or leave empty to use the default value.
Git mirror [https://github.com.cnpmjs.org]:
Apt mirror [http://mirrors.bfsu.edu.cn]:
Pip mirror [https://opentuna.cn/pypi/web/simple/]:
Raw files mirror [https://raw.staticdn.net]:
2. Building on a limited resource system (8 GB of RAM or less) can be problematic.
Perform the following steps to minimize issues:
a. Save the output in a file instead of printing it, because printing consumes RAM resources.
NOTE Edge Insights for Autonomous Mobile Robots comes with prebuilt images and building all
images is not required. Only do this step if you want to regenerate all images.
b. For remote connections, use an ssh connection instead of a VNC one as VNC connection consumes
resources.
c. For building multiple Docker* images, do not use the --parallel option as it requires more
resources.
NOTE Edge Insights for Autonomous Mobile Robots comes with prebuilt images and building all
images is not required. Only do this step if you want to regenerate all images.
cd <edge_insights_for_amr_path>Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers/
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=35
docker-compose -f 01_docker_sdk_env/docker_compose/01_amr/amr-sdk.all.yml run --name full-
flavour-sdk full-sdk bash
2. In a different terminal, attach to this Docker* image:
export ROS_DOMAIN_ID=35
wget <ROS 1 BAG>
ros2 bag play -s rosbag_v2 <ROS 1 BAG>
echo $DISPLAY
If this variable is empty, it causes issues when opening applications that need a GUI.
The most common solution is to give it the 0:0 value:
If it is not, find out the value of DISPLAY set by vncserver and then set the correct value:
For example:
ps ax |grep vncserver
/usr/bin/Xtigervnc :42 -desktop ....
/usr/bin/perl /usr/bin/vncserver -localhost no -geometry 1920x1000 -depth 24 :42
Use ROS_DOMAIN_ID to Avoid Interference in ROS Messages
A typical method to demonstrate a use case requires you to start a container (or group of containers) and
exchange ROS messages between various ROS nodes. However, interference from other ROS nodes can
disrupt the whole process. For example, you might receive ROS messages from unknown nodes that are not
intended for the demo use case. These other nodes could be on the same host machine or on other host
machines within the local network. In this scenario, it can be difficult to debug and resolve the interference.
You can avoid this by declaring ROS_DOMAIN_ID as a fixed numeric value per use case, under the following
conditions:
• The ROS_DOMAIN_ID should be the same for all containers launched for a particular use case.
• The ROS_DOMAIN_ID should be an integer between 0 and 101.
• After launching the container, you can declare it with:
export ROS_DOMAIN_ID=<value>
For more information, go to: ROS_DOMAIN_ID
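As a minimal sketch, a shell guard (the set_ros_domain_id function name is illustrative) can enforce these conditions before exporting the variable:

```shell
# Illustrative guard: only export ROS_DOMAIN_ID if the candidate value is an
# integer in the valid 0-101 range described above.
set_ros_domain_id() {
  local id="$1"
  if [[ "$id" =~ ^[0-9]+$ ]] && [ "$id" -le 101 ]; then
    export ROS_DOMAIN_ID="$id"
    echo "ROS_DOMAIN_ID=$ROS_DOMAIN_ID"
  else
    echo "invalid ROS_DOMAIN_ID: $id (use an integer between 0 and 101)" >&2
    return 1
  fi
}
set_ros_domain_id 35
```

Remember to also verify that the chosen value is not already used by another ROS system on the same network.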
To add the ROS_DOMAIN_ID, you can choose any of the following options.
1. Add it in the common.yml file, where it applies to all use cases:
# In file 01_docker_sdk_env/docker_compose/common/common.yml
# A ROS_DOMAIN_ID added here applies to all use cases
services:
common:
environment:
ROS_DOMAIN_ID: <choose ID>
2. Add it in the .env file for all containers:
# In file 01_docker_sdk_env/docker_compose/01_amr/.env
# add below line and provide ROS_DOMAIN_ID
ROS_DOMAIN_ID=<choose ID>
3. Add it in the specific yml file for a specific use case for specific targets:
services:
ros-base:
image: ${REPO_URL}amr-ubuntu2004-ros2-foxy-ros-base:${DOCKER_TAG:-latest}
container_name: ${CONTAINER_NAME_PREFIX:-amr-sdk-}ros-base
environment:
ROS_DOMAIN_ID: <choose ID>
env_file:
- ./.env
extends:
4. Add it in the command: section of the specific yml file; this takes effect only after the containers launch:
# In file 01_docker_sdk_env/docker_compose/05_tutorials/fleet_mngmnt_with_low_battery.up.tutorial.yml
# In the below example, ROS_DOMAIN_ID is set to 58
# You may change it to any value that your use case requires.
services:
battery_bridge:
image: ${REPO_URL}amr-ubuntu2004-ros2-foxy-battery_bridge:${DOCKER_TAG:-latest}
container_name: ${CONTAINER_NAME_PREFIX:-amr-sdk-}battery_bridge
extends:
file: ../01_amr/amr-sdk.all.yml
service: ros-base
volumes:
- /dev/battery_bridge:/dev/battery_bridge:rw
build:
target: battery_bridge
network_mode: host
restart: "no"
command:
- |
source ros_entrypoint.sh
source battery-bridge/src/prebuilt_battery_bridge/local_setup.bash
export ROS_DOMAIN_ID=58
sleep 5
ros2 run battery_pkg battery_bridge
5. Add it while running a container using the run_interactive_docker.sh script:
NOTE You can use any number between 0 and 101 (inclusive), to set ROS_DOMAIN_ID, as long as it is
not used by a different ROS system.
Be aware that you can also use these options to modify other environment variables.
Running the script can fail with an error like:
docker: Error response from daemon: error while creating mount source path '/nfs/site/home/<user>': mkdir /nfs/site/home/<user>: file exists.
To avoid this, before you run a Docker* image, create a new directory in /tmp (or any locally mounted
volume), and set $HOME to the new path:
mkdir /tmp/tmp_home
export HOME=/tmp/tmp_home
./run_interactive_docker.sh eiforamr-full-flavour-sdk:<release_tag> eiforamr
• The Intel® Smart Edge Open control plane, which deploys ThingsBoard* to the edge node (the ThingsBoard* GUI is accessed with the control plane IP and mapped port)
NOTE In a Single-Node deployment, ThingsBoard* is installed on the same machine as the control
plane.
In a Multi-Node deployment, ThingsBoard* is installed on an edge node, not the control plane.
NOTE The diagram only shows two robots but you can add as many as you need.
• The FDO server, which executes the manufacturer, rendezvous, and owner servers:
• edge-server-fdo-manufacturer on terminal 1
• edge-server-fdo-owner on terminal 2
• edge-server-fdo-rendezvous on terminal 3
• terminal 4 for configuration and control
NOTE The FDO server can be on any machine in the same network as the control plane. In this
tutorial, the FDO server is on an edge node.
1. The FDO owner sends the FDO script, fileserver access, and filelist to the robot in the field to be onboarded.
2. The FDO client saves and starts the FDO script.
3. FDO loads and stores files from FileServer.
4. FDO registers the device in ThingsBoard* and writes the Intel® In-Band Manageability configuration.
a. FDO provisions each new device.
Device naming convention:
BasicFleetManagement_tutorial_127.0.0.1_noop_00005E0053EA
b. FDO saves the Intel® In-Band Manageability configuration and certification files in the host file
system.
5. FDO registers the device in Intel® Smart Edge Open and gets the token and hash.
6. FDO starts the Intel® Smart Edge Open install script.
7. Intel® Smart Edge Open deploys all configured containers, including Intel® In-Band Manageability, and
brings them up.
8. When ThingsBoard* receives a new device online event, ThingsBoard* triggers a firmware and OS update. After the update completes, the device power cycles.
Prerequisites
You must do all sections of this tutorial in order.
Configure the edge with the Get Started Guide for Robot Orchestration.
Verify that the robot has a product name.
a. The entire HTTP URL with the .tar.gz file for the firmware file.
b. The Manufacturer, Vendor, and Product name with the output of the following commands, executed on the robot.
NOTE Updating the Manufacturer, Vendor, and Product name needs to be done every time you
onboard a new type of robot. If these values do not match the ones from the robot trying to onboard,
the flow fails.
Configure the Robot and the FDO Server for the Onboarding Flow
1. Robot and FDO server: Download and install the needed scripts from the latest release.
NOTE These steps install only certain modules (Docker* Community Edition (CE) and Docker Compose*) and the set of scripts needed for this onboarding tutorial. They do not install the full Robot Complete Kit bundle on your robot.
unzip edge_insights_for_amr.zip
cd edge_insights_for_amr
chmod 775 edgesoftware
export no_proxy="127.0.0.1/32,devtools.intel.com"
./edgesoftware download
./edgesoftware list
NOTE Get the IDs for Docker* Community Edition (CE) and Docker Compose*:
• Edit /etc/ssh/sshd_config:
PermitRootLogin yes
• Restart the ssh service:
2. FDO server All images in the FDO pipeline are self-contained and require minimal configuration.
Configuration settings are all handled by external environment files, but some environment files need to
be generated by running the fdo_keys_gen.sh script:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_server_containers/01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/
chmod +x fdo_keys_gen.sh
bash fdo_keys_gen.sh .
3. Robot Install the Battery Bridge Kernel Module.
cd components/amr_battery_bridge_kernel_module/src/
chmod a+x module_install.sh
# install the battery-bridge kernel module
sudo ./module_install.sh
# uninstall the battery-bridge kernel module (if needed)
sudo ./module_install.sh -u
The Battery Bridge Kernel Module does not work on Secure Boot machines. To disable UEFI Secure
Boot:
a. Go to the BIOS menu.
b. Open Boot > Secure Boot.
c. Disable Secure Boot.
d. Save the new configuration, and reboot the machine.
NOTE When the robot uses an actual battery, the robot's sensor driver provides the corresponding ROS interface, which writes the battery status into the generic ROS 2 topic /sensors/battery_state. However, this information is usually not transmitted to the generic OS interface /sys/class/power_supply, so components that interact with the OS directly (for example, Intel® In-Band Manageability) cannot get battery information from the OS. To bridge this gap, the ROS component battery-bridge and the battery-bridge-kernel-module are provided. Using this bridge, the battery status is transmitted via a kernel module to the standard OS interface /sys/class/power_supply. The kobuki driver and kobuki_ros_interfaces are proven to work with the battery-bridge and battery-bridge-kernel-module components.
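The bridge concept in the note can be sketched with stand-in paths (a temporary file mimics the kernel-provided capacity node; none of these paths are the real interface):

```shell
# Conceptual sketch of the battery-bridge flow: a battery percentage reported
# on the ROS 2 topic /sensors/battery_state is written where OS-level tools
# expect it, mimicking /sys/class/power_supply/<supply>/capacity.
ros_battery_percentage=87        # illustrative value from the ROS 2 topic
mock_supply_dir=$(mktemp -d)     # stand-in for /sys/class/power_supply/<supply>
echo "$ros_battery_percentage" > "$mock_supply_dir/capacity"
cat "$mock_supply_dir/capacity"
```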
4. Robot Set the robot type by adding your robot type to /etc/robottype. The supported values are
amr-aaeon and amr-pengo. Example:
options http-proxy="${http_proxy}" --recv-key F6E65AC044F831AC80A06380C8B3A55A6F3EFCDE; fi \
&& sudo add-apt-repository "deb https://librealsense.intel.com/Debian/apt-repo focal main" -u \
&& DEBIAN_FRONTEND=noninteractive sudo apt-get install --no-install-recommends -q -y \
    rsync \
    librealsense2=2.50.* \
    librealsense2-utils=2.50.* \
    librealsense2-dev=2.50.* \
    librealsense2-gl=2.50.* \
    librealsense2-net=2.50.* \
    librealsense2-dbg=2.50.* \
    librealsense2-udev-rules=2.50.* \
&& sudo rm -rf /var/lib/apt/lists/*
sudo dpkg --configure -a
sudo mkdir -p /var/cache/manageability/repository-tool/sota
6. Robot Disable swap:
export DISPLAY=0:0
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/
export no_proxy=<no_proxy>,ip_from_fdo_server,ip_from_robot,localhost
sudo su
source ./AMR_containers/01_docker_sdk_env/docker_compose/common/docker_compose.source
2. FDO server, all terminals:
export DISPLAY=0:0
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/
source ./AMR_server_containers/01_docker_sdk_env/docker_compose/common/docker_compose.source
NOTE Set up the environment on every terminal on which you want to run docker-compose
commands.
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
nano 01_docker_sdk_env/artifacts/01_amr/amr_fdo/device.config
MANUFACTURER_IP_ADDRESS = ip_from_FDO_Server
c. For onboarding multiple robots, use a unique serial number for the DEVICE_SERIAL_NUMBER
variable.
This value must be unique for each robot that you onboard. Therefore, the default serial number,
1234abcd, can only be used once.
DEVICE_SERIAL_NUMBER = <unique_serial_number>
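One hedged way to generate unique serial numbers (the 8-character hex format mirrors the 1234abcd default; your deployment may require a different format) is:

```shell
# Hypothetical helper: derive a unique 8-character hex serial number per robot
# so the default 1234abcd is never reused across onboarded devices.
serial=$(tr -d '-' < /proc/sys/kernel/random/uuid | cut -c1-8)
echo "DEVICE_SERIAL_NUMBER = ${serial}"
```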
d. Build the FDO client image:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_containers
docker-compose -f ./01_docker_sdk_env/docker_compose/01_amr/amr-sdk.all.yml build fdo-client
2. FDO server terminal 1 Build the FDO manufacturer server image:
Before building the FDO manufacturer image, several configuration values need to be adjusted.
a. Open 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/manufacturer/
service.yml:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/
AMR_server_containers
nano 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/manufacturer/service.yml
b. Add the following lines, replacing dns_from_step_4 and ip_from_FDO_Server with the respective DNS and IP address of the rendezvous server:
rv-instruction:
dns: dns_from_step_4
ip: ip_from_FDO_Server
c. Build the manufacturer server image:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_server_containers
docker-compose -f ./01_docker_sdk_env/docker_compose/02_edge_server/edge-server.all.yml build fdo-manufacturer
3. FDO server terminal 2 Build the owner server image:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_server_containers
docker-compose -f ./01_docker_sdk_env/docker_compose/02_edge_server/edge-server.all.yml build fdo-owner
4. FDO server terminal 3 Build the rendezvous server image:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_server_containers
docker-compose -f ./01_docker_sdk_env/docker_compose/02_edge_server/edge-server.all.yml build fdo-rendezvous
See Troubleshooting if docker-compose errors are encountered.
Initialize FDO
1. FDO server - terminal 4 Adjust the Python script for your setup.
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_server_containers/
nano 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/scripts/sdo_script.py
a. For DEF_TB_MQTT_PORT, replace 1883 with 18883.
b. For network, replace:
|1|pYOofp22FlwwWNHH+vaK8gWhSxw=|S713N4hkiSRJCzfJQgqMfaYTJWw= ecdsa-sha2-nistp256 AABBE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFv3xFkoWZuALLa/iH8fLBK5ciKkvep+61DAGEBSiORQbPxUtvBo0qbi14/N+KD58YEkWrrzlQIEsp/minlSVKE=
with the output of the following command:
The values for device_key and device_secret are obtained from the ThingsBoard* web
interface. Go to Thingsboard > Device Profiles > Device Profiles details > Device Provisioning.
In preconfigured data, the following are set in ThingsBoard*:
device_key = "9oq7uxtdsgt4yjyqdekg"
device_secret = "6z3j3osphpr8ck1b9ocp"
e. For seo:
a. For host, replace xx.xxx.xx.xxx with the control plane IP.
b. For crt_hash, replace fd6d98ee914f5e08df1858b2e82e1ebacbcf35cae0ddd7e146ec18fa200a265b with the output of the following commands on the control plane:
cd /etc/kubernetes/pki/
openssl x509 -pubkey -in ca.crt | openssl rsa -pubin -outform der 2>/dev/null | openssl dgst -sha256 -hex | sed 's/^.* //'
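As a quick sanity check, the crt_hash produced by these commands is a SHA-256 digest, so it should be exactly 64 lowercase hex characters (the sample digest below is the placeholder value from above):

```shell
# Sanity check: a SHA-256 digest rendered in hex is 64 lowercase hex characters.
hash="fd6d98ee914f5e08df1858b2e82e1ebacbcf35cae0ddd7e146ec18fa200a265b"
if echo "$hash" | grep -Eq '^[0-9a-f]{64}$'; then
  echo "valid sha256 hex"
else
  echo "unexpected crt_hash format" >&2
fi
```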
f. For sftp_filelist:
a. In the fdo_sftp/etc/docker/certs.d line, replace 10.237.22.133 with the IP of the control plane.
b. Ensure that the path after every "file":" entry begins with /.
{"file":"/fdo_sftp/root/.docker/config.json","path":"/host/root/.docker/" },\
{"file":"/fdo_sftp/etc/docker/daemon.json","path":"/host/etc/docker/" },\
{"file":"/fdo_sftp/etc/docker/certs.d/<Replace here with Control Plane IP>:30003/ca.crt","path":"/host/etc/docker/certs.d/<Replace here with Control Plane IP>:30003" },\
{"file":"/fdo_sftp/etc/systemd/system/docker.service.d/http-proxy.conf","path":"/host/etc/systemd/system/docker.service.d" },\
{"file":"/fdo_sftp/seo_install.sh","path":"/host/root" },\
{"file":"/fdo_sftp/k8s_apply_label.py","path":"/host/root" },\
{"file":"/fdo_sftp/etc/amr/ri-certs/server.pem","path":"/host/etc/amr/ri-certs" },\
{"file":"/fdo_sftp/etc/amr/ri-certs/client.key","path":"/host/etc/amr/ri-certs" },\
{"file":"/fdo_sftp/etc/amr/ri-certs/client.pem","path":"/host/etc/amr/ri-certs" }]'
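A minimal check of the leading-slash rule from step b (the sample line mirrors the filelist format above):

```shell
# Minimal check: every "file" entry must begin with a leading slash.
line='{"file":"/fdo_sftp/seo_install.sh","path":"/host/root" },\'
case "$line" in
  *'"file":"/'*) echo "leading slash OK" ;;
  *)             echo "missing leading slash" >&2 ;;
esac
```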
2. FDO server terminal 4 Edit 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/scripts/multi_machine_config.sh:
nano 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/scripts/multi_machine_config.sh
a. Assign the value from 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/creds/manufacturer/service.env to the variable mfg_api_passwd.
cat 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/creds/manufacturer/service.env
b. Assign the value from 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/creds/owner/service.env to the variable default_onr_api_passwd.
cat 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/creds/owner/service.env
c. Replace {rv-dns} with the FDO server DNS.
d. Replace {owner-dns} with the FDO server DNS.
e. Replace {rv-ip} with the FDO server IP.
f. Replace {owner-ip} with the FDO server IP.
g. Replace http://localhost:8042 and http://localhost:8039 in both curl commands with http://FDO_SERVER_IP:8042 and http://FDO_SERVER_IP:8039, respectively.
Example (without the curly brackets):
mfg_api_passwd={manufacturer_api_password_from_service.env}
onr_api_passwd={owner_api_password_from_service.env}
.......................................................
# Updating RVInfo blob in Manufacturer
# Replace localhost, {rv-dns}, and {rv-ip} references with the respective DNS and IP address of the host machine
curl -D - --digest -u "${api_user}":"${mfg_api_passwd}" --location --request POST 'http://<ip_from_FDO_SERVER>:8039/api/v1/rvinfo' \
--header 'Content-Type: text/plain' \
--data-raw '[[[5,"dns"],[3,8040],[12,1],[2,"ip_from_FDO_SERVER"],[4,8040]]]'
3. FDO server Edit 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/scripts/extend_upload.sh:
nano 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/scripts/extend_upload.sh
a. Assign the value from 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/creds/manufacturer/service.env to the variable default_mfg_api_passwd.
cat 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/creds/manufacturer/service.env
b. Assign the value from 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/creds/owner/service.env to the variable default_onr_api_passwd.
cat 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/creds/owner/service.env
c. Assign the FDO server IP to the variable default_mfg_ip.
d. Assign the FDO server IP to the variable default_onr_ip.
Example:
default_mfg_ip="<ip_from_FDO_SERVER>"
default_onr_ip="<ip_from_FDO_SERVER>"
...........................
default_mfg_api_passwd="<manufacturer_api_password_from_service.env>"
default_onr_api_passwd="<owner_api_password_from_service.env>"
4. FDO server terminal 3 Edit 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/scripts/configure_serviceinfo.sh, and set the following variables:
nano 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/scripts/configure_serviceinfo.sh
a. Assign the FDO server IP to the variable OWNER_IP.
Onboard
FDO is a new IoT standard built on the Intel® Secure Device Onboard (Intel® SDO) specifications. It is the first step in onboarding a device. The FDO specification defines four entities.
• Device: the EI for AMR device plus the FDO client (the FDO client supports the FDO protocol)
• Manufacturer Server: the entity that is responsible for the initial steps of the FDO protocol and loading
credentials onto the device, and is also a part of the production flow of the EI for AMR device
• Owner Server: the entity that sends all required data (for example, keys and certificates) to the device in
the final protocol step TO2
• Rendezvous Server: the first contact point for the device after you switch the device on and configure it
for network communication. The rendezvous server sends the device additional information, for example,
how to contact the owner server entity.
All containers, including the client, follow this command structure:
docker-compose -f <.yml path used during build stage> up <fdo service name>
1. FDO server terminal 1 Run the manufacturer server:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_server_containers/
docker-compose -f 01_docker_sdk_env/docker_compose/02_edge_server/edge-server.all.yml up fdo-manufacturer
2. FDO server terminal 2 Run the owner server:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_server_containers/
docker-compose -f 01_docker_sdk_env/docker_compose/02_edge_server/edge-server.all.yml up fdo-owner
3. FDO server terminal 3 In a new terminal window, run the rendezvous server:
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_server_containers/
docker-compose -f 01_docker_sdk_env/docker_compose/02_edge_server/edge-server.all.yml up fdo-rendezvous
cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_*/AMR_server_containers/
sudo su
export no_proxy=<no_proxy>,ip_from_FDO_SERVER,ip_from_ROBOT,localhost
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
export ROS_DOMAIN_ID=17
CHOOSE_USER=root docker-compose -f 01_docker_sdk_env/docker_compose/05_tutorials/fdo_client_onboard.yml up
After running the FDO client for the first time, the device initialization is complete:
NOTE When starting FDO containers, start the FDO client image last, because the FDO client immediately begins reaching out to the manufacturer server to complete device initialization (DI), and it only attempts this connection a few times before exiting. If the FDO client successfully connects to the manufacturer server, the manufacturer server assigns a GUID to the FDO client and generates an ownership voucher for use in the rest of the pipeline.
cd 01_docker_sdk_env/artifacts/02_edge_server/edge_server_fdo/scripts/
chmod +x *
sudo su
export no_proxy=<no_proxy>,ip_from_FDO_SERVER,ip_from_Robot,localhost
./multi_machine_config.sh
Expected output:
HTTP/1.1 401
WWW-Authenticate: Digest realm="Authentication required", qop="auth",
nonce="1652260953609:a1f80c513623b4c7b87292c054d5d650", opaque="4F6AB1DF45A94C67D59892BC7DB6B6B4"
Content-Type: text/html;charset=utf-8
Content-Language: en
Content-Length: 673
Date: Wed, 11 May 2022 09:22:33 GMT
HTTP/1.1 200
Content-Length: 0
Date: Wed, 11 May 2022 09:22:33 GMT
HTTP/1.1 401
WWW-Authenticate: Digest realm="Authentication required", qop="auth",
nonce="1652260953705:0e2856e16da3eb830dca777a34f1f154", opaque="E11DE6169652A5495FC93933790D1A04"
Content-Type: text/html;charset=utf-8
Content-Language: en
Content-Length: 673
Date: Wed, 11 May 2022 09:22:33 GMT
HTTP/1.1 200
Content-Length: 0
Date: Wed, 11 May 2022 09:22:33 GMT
7. FDO server terminal 4 Run the configure_serviceinfo.sh script:
./configure_serviceinfo.sh
Expected output:
HTTP/1.1 100
HTTP/1.1 200
Content-Length: 0
Date: Thu, 19 May 2022 06:19:05 GMT
8. FDO server terminal 4 Add the robot by using the serial number.
./extend_upload.sh -s <serial_number>
# By default, the serial number is 1234abcd; the expected output below assumes this serial number.
./extend_upload.sh -s 1234abcd # use your robot's serial number
NOTE The serial number is the value of DEVICE_SERIAL_NUMBER from the 01_docker_sdk_env/artifacts/01_amr/amr_fdo/device.config file, set on the robot when preparing to build the FDO server in Prepare the Environment Needed to Build the FDO Docker* Images.
Expected output:
10. Robot
NOTE FDO protocol steps TO1 and TO2 can take more than five minutes.
Expected result:
Control plane In the ThingsBoard* GUI, the robot appears in Devices as a new device.
NOTE The device is online on the Dashboard after the Intel® In-Band Manageability container on the robot is brought up successfully.
Robot The wandering app is deployed from the Intel® Smart Edge Open controller, and the robot starts
to wander around.
12. Verify that the onboarding was successful by checking the following logs on the control plane:
pod/onboarding-deployment-95f5dc897-j6t5j 0/16 Pending 0 61m
<none> <none> <none> <none>
pod/onboarding-deployment-95f5dc897-qd22f 16/16 Running 38 (4m15s ago) 61m
10.245.224.68 glaic3ehlaaeon2 <none> <none>
IMAGES
SELECTOR
deployment.apps/onboarding-deployment 0/5 5 0 61m dds-bridge,amr-
fleet-management,vda5050-ros2-bridge,amr-realsense,amr-ros-base-camera-tf,amr-aaeon-amr-
interface,amr-ros-base-teleop,amr-battery-bridge,amr-object-detection,imu-madgwick-filter,robot-
localization,amr-collab-slam,amr-fastmapping,amr-nav2,amr-wandering,amr-vda-navigator
10.237.22.198:30003/intel/eclipse/zenoh-bridge-dds:0.5.0-beta.9,10.237.22.198:30003/intel/amr-
fleet-management:2022.3,10.237.22.198:30003/intel/amr-vda5050-ros2-
bridge:2022.3,10.237.22.198:30003/intel/amr-realsense:2022.3,10.237.22.198:30003/intel/amr-ros-
base-camera-tf:2022.3,10.237.22.198:30003/intel/amr-aaeon-amr-
interface:2022.3,10.237.22.198:30003/intel/amr-ros-base-teleop:2022.3,10.237.22.198:30003/intel/
amr-battery-bridge:2022.3,10.237.22.198:30003/intel/amr-object-
detection:2022.3,10.237.22.198:30003/intel/amr-imu-madgwick-filter:2022.3,10.237.22.198:30003/
intel/amr-robot-localization:2022.3,10.237.22.198:30003/intel/amr-collab-
slam:2022.3,10.237.22.198:30003/intel/amr-fastmapping:2022.3,10.237.22.198:30003/intel/amr-
nav2:2022.3,10.237.22.198:30003/intel/amr-wandering:2022.3,10.237.22.198:30003/intel/amr-vda-
navigator:2022.3 app.kubernetes.io/instance=onboarding-abcxzy,app.kubernetes.io/name=onboarding
IMAGES
SELECTOR
replicaset.apps/onboarding-deployment-95f5dc897 5 5 0 61m dds-
bridge,amr-fleet-management,vda5050-ros2-bridge,amr-realsense,amr-ros-base-camera-tf,amr-aaeon-
amr-interface,amr-ros-base-teleop,amr-battery-bridge,amr-object-detection,imu-madgwick-
filter,robot-localization,amr-collab-slam,amr-fastmapping,amr-nav2,amr-wandering,amr-vda-
navigator 10.237.22.198:30003/intel/eclipse/zenoh-bridge-dds:0.5.0-beta.9,10.237.22.198:30003/
intel/amr-fleet-management:2022.3,10.237.22.198:30003/intel/amr-vda5050-ros2-
bridge:2022.3,10.237.22.198:30003/intel/amr-realsense:2022.3,10.237.22.198:30003/intel/amr-ros-
base-camera-tf:2022.3,10.237.22.198:30003/intel/amr-aaeon-amr-
interface:2022.3,10.237.22.198:30003/intel/amr-ros-base-teleop:2022.3,10.237.22.198:30003/intel/
amr-battery-bridge:2022.3,10.237.22.198:30003/intel/amr-object-
detection:2022.3,10.237.22.198:30003/intel/amr-imu-madgwick-filter:2022.3,10.237.22.198:30003/
intel/amr-robot-localization:2022.3,10.237.22.198:30003/intel/amr-collab-
slam:2022.3,10.237.22.198:30003/intel/amr-fastmapping:2022.3,10.237.22.198:30003/intel/amr-
nav2:2022.3,10.237.22.198:30003/intel/amr-wandering:2022.3,10.237.22.198:30003/intel/amr-vda-
navigator:2022.3 app.kubernetes.io/instance=onboarding-abcxzy,app.kubernetes.io/
name=onboarding,pod-template-hash=95f5dc897
For amr-pengo, run:
$ docker images
<Control_Plane_IP>:30003/intel/amr-ros-base-camera-tf latest
31735754089b 2 days ago 8.25GB
<Control_Plane_IP>:30003/intel/amr-wandering latest
31735754089b 2 days ago 8.25GB
<Control_Plane_IP>:30003/intel/amr-fastmapping latest
5c1bbefc1d17 2 days ago 2.28GB
<Control_Plane_IP>:30003/intel/amr-collab-slam latest
415975276b1f 2 days ago 3.24GB
<Control_Plane_IP>:30003/intel/amr-aaeon-amr-interface latest
5d94f57da0d1 2 days ago 2.37GB
<Control_Plane_IP>:30003/intel/amr-realsense latest
1dab67f4d287 2 days ago 3GB
<Control_Plane_IP>:30003/intel/amr-ros-base-camera-tf latest
0ac635f5633f 2 days ago 1.76GB
<Control_Plane_IP>:30003/intel/amr-nav2 latest
769353e041bf 2 days ago 3.55GB
<Control_Plane_IP>:30003/intel/amr-kobuki latest
799ed6f79385 2 days ago 3.06GB
<Control_Plane_IP>:30003/intel/amr-fleet-management latest
e91bf2815f65 2 days ago 1.79GB
<Control_Plane_IP>:30003/intel/amr-vda-navigator latest
499c0c09b685 2 days ago 2.08GB
<Control_Plane_IP>:30003/intel/amr-vda5050-ros2-bridge latest
4e8282a666be 2 days ago 2.06GB
<Control_Plane_IP>:30003/intel/eclipse/zenoh-bridge-dds 0.5.0-beta.9
1a5e41449966 9 months ago 86.1MB
<Control_Plane_IP>:30003/intel/node-feature-discovery v0.9.0
00019dda899b 13 months ago 123MB
NOTE Pod deployment may take a while because of the size of the Docker* containers from the pod.
If you get an error after the deployment, wait a few minutes. The pods automatically restart, and the
error goes away. If the error persists after a few automatic restarts, restart the pod manually from the
control plane:
$ docker ps
86184dab6d92 10.237.22.39:30003/intel/amr-ros-base-camera-tf "/bin/bash -c 'sourc…"
About a minute ago Up About a minute k8s_amr-ros-base-teleop_wandering-
deployment-86d6b669d6-rzlgr_wandering_c00ecd97-2217-4f4f-a62c-9f99bc44ac7d_1
9d19c163076f 10.237.22.39:30003/intel/amr-wandering "/bin/bash -c 'sourc…"
About a minute ago Up About a minute k8s_amr-wandering_wandering-
deployment-86d6b669d6-rzlgr_wandering_c00ecd97-2217-4f4f-a62c-9f99bc44ac7d_0
b9f03850310e 10.237.22.39:30003/intel/amr-nav2 "/bin/bash -c 'sourc…"
About a minute ago Up About a minute k8s_amr-nav2_wandering-deployment-86d6b669d6-
rzlgr_wandering_c00ecd97-2217-4f4f-a62c-9f99bc44ac7d_0
8fb3fb882505 10.237.22.39:30003/intel/amr-fastmapping "/bin/bash -c 'sourc…"
About a minute ago Up About a minute k8s_amr-fastmapping_wandering-
deployment-86d6b669d6-rzlgr_wandering_c00ecd97-2217-4f4f-a62c-9f99bc44ac7d_0
1f122686f8e1 10.237.22.39:30003/intel/amr-collab-slam "/bin/bash -c 'sourc…"
About a minute ago Up About a minute k8s_amr-collab-slam_wandering-
deployment-86d6b669d6-rzlgr_wandering_c00ecd97-2217-4f4f-a62c-9f99bc44ac7d_0
ee7e6cd8b50a 10.237.22.39:30003/intel/amr-aaeon-amr-interface "/bin/bash -c 'sourc…"
About a minute ago Up About a minute k8s_amr-aaeon-amr-interface_wandering-
deployment-86d6b669d6-rzlgr_wandering_c00ecd97-2217-4f4f-a62c-9f99bc44ac7d_0
009efc5405af 10.237.22.39:30003/intel/amr-ros-base-camera-tf "/bin/bash -c 'sourc…"
About a minute ago Up About a minute k8s_amr-ros-base-camera-tf_wandering-
deployment-86d6b669d6-rzlgr_wandering_c00ecd97-2217-4f4f-a62c-9f99bc44ac7d_0
1a6409b8c361 10.237.22.39:30003/intel/amr-realsense "/bin/bash -c 'sourc…"
About a minute ago Up About a minute k8s_amr-realsense_wandering-
deployment-86d6b669d6-rzlgr_wandering_c00ecd97-2217-4f4f-a62c-9f99bc44ac7d_0
15. After the Onboarding process is finished, the Firmware Update and Operating System Update are
triggered automatically. If you want to start the update manually, see OTA Updates.
Hosts Cleanup
Warning These steps erase most of the work done in previous steps, so only do them when you want to clean up your machines.
To remake a setup after these cleanup steps, restart the onboarding process from the beginning.
1. Robot
4. Robot If the Robot was added to the Intel® Smart Edge Open cluster, remove it:
kubeadm reset
systemctl restart kubelet
5. Robot If the docker images are already running, remove these images:
rm -rf /etc/tc
Troubleshooting
1. If a docker-compose error is encountered while building the FDO Docker* images, update the docker-compose version:
FDO References
Term Reference
DMS N/A
FDO https://fidoalliance.org/intro-to-fido-device-onboard/
FIDO https://en.wikipedia.org/wiki/FIDO_Alliance
RV https://fidoalliance.org/specs/FDO/FIDO-Device-Onboard-RD-
v1.0-20201202.html
The basic fleet management solution consists of server and client architecture.
• For server setup which is orchestrated by Intel® Smart Edge Open, see the Get Started Guide for Robot
Orchestration.
• For client setup in which Intel® Smart Edge Open onboards and deploys devices for basic fleet
management use cases, see Device Onboarding End-to-End Use Case.
The following diagram presents the architecture, components, and communication between components.
For the remote inference use case, the requests from the ROS 2 node go to the OpenVINO™ model server (https://github.com/openvinotoolkit/model_server/tree/main/extras/nginx-mtls-auth) via an SSL channel.
• Basic Fleet Management Use Case
• Remote Inference End-to-End Use Case
b. Replace the default command with the following VDA5050 json format embedded command:
{
"nodeId":"A",
"sequenceId": 7,
"released": true,
"nodePosition":{
"x":0.3,
"y":0.8,
"theta":0,
"mapId": "001"
},
"actions":[]
},
{
"nodeId":"B",
"sequenceId": 7,
"released": true,
"nodePosition":{
"x":0.9,
"y":0.8,
"theta":0,
"mapId": "001"
},
"actions":[]
}
],
"edges":[
{
"edgeId": "edge9",
"sequenceId": 0,
"edgeDescription": "edge1",
"released": false,
"startNodeId": "Origin",
"endNodeId": "AnotherNode",
"maxSpeed": 0.0,
"maxHeight": 0.0,
"minHeight": 0.0,
"orientation": 0.0,
"direction": "straight",
"rotationAllowed": true,
"maxRotationSpeed": 0.0,
"length": 0.0,
"trajectory": {
"degree": 0.0,
"knotVector": [
0.0,
0.5,
0.6,
1.0
],
"controlPoints": [
{
"x": 0.0,
"y": 0.0,
"weight": 0.0
}]},
"actions": []}]
}</data> </custom></manifest>
c. Click Send. Intel® In-Band Manageability forwards the message to the VDA5050 client.
Expected result: A status update is sent to the ThingsBoard* Dashboard Event Log, and the robot starts
navigating to the navigation goals.
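For reference, node entries in the format used by the order above can be generated programmatically. A minimal sketch; the make_node helper is ours, and the field names follow the VDA5050 examples shown above:

```python
import json

def make_node(node_id, sequence_id, x, y, theta=0.0, map_id="001"):
    # Build one VDA5050 node entry in the format used above.
    return {
        "nodeId": node_id,
        "sequenceId": sequence_id,
        "released": True,
        "nodePosition": {"x": x, "y": y, "theta": theta, "mapId": map_id},
        "actions": [],
    }

# The two navigation goals from the embedded command above.
nodes = [make_node("A", 7, 0.3, 0.8), make_node("B", 7, 0.9, 0.8)]
print(json.dumps(nodes, indent=2))
```

Generating the node list this way avoids hand-editing errors such as unbalanced quotes in the JSON payload.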
3. Send a command to update the navigation goals by adding a new one, “C”.
NOTE The orderId remains the same; orderUpdateId flags this as an update of the previous
instructions.
"mapId": "001"},
"actions":[]}],
"edges":[
{
"edgeId": "edge2",
"sequenceId": 0,
"edgeDescription": "c",
"released": false,
"startNodeId": "B",
"endNodeId": "C",
"maxSpeed": 0.0,
"maxHeight": 0.0,
"minHeight": 0.0,
"orientation": 0.0,
"direction": "straight",
"rotationAllowed": true,
"maxRotationSpeed": 0.0,
"length": 0.0,
"trajectory": {
"degree": 0.0,
"knotVector": [
0.0,
0.5,
0.6,
1.0],
"controlPoints": [{
"x": 0.0,
"y": 0.0,
"weight": 0.0}]
},
"actions": []}]
}</data> </custom></manifest>
c. Click Send. Intel® In-Band Manageability forwards the message to the VDA5050 client.
Expected result: A status update is sent to the ThingsBoard* Dashboard Event Log, and the robot starts
navigating to the navigation goals, now including the C navigation goal.
Command the Robot Back to the Docking Station When Its Battery Reaches the 40% Threshold
Collaboration Diagram
When a robot’s battery level is less than 40%, basic fleet management tells the robot to move to the origin
position. The following diagram depicts the steps.
NOTE VNC interferes with the Intel® Smart Edge Open installation. Intel recommends that you open
the basic fleet management dashboard on a different system, as the dashboard is accessible via the
internet.
If the fleet management server dashboard is not accessible on a system in the same network, check
Troubleshooting for Robot Orchestration Tutorials, “Fleet Management Server Dashboard over LAN
Issues”.
The following home page is loaded. Device Profiles and Devices are loaded with pre-configured data
from Intel.
b. Intel added rules to the Rule Chain to fulfill the following use cases:
• Basic fleet management
• Remote inference
• Onboarding
• OTA update
NOTE To add new clients to the fleet management server, see Troubleshooting for Robot Orchestration
Tutorials, “Add New Clients to the Fleet Management Server”.
b. The dashboard shows the device’s basic information and telemetry data, for example:
• The INB_Fleet_Management_Client device is currently online (if a device is onboarded through
the onboarding process, an additional device is shown as online).
• The battery status is numeric and presented as an average value.
• The battery level is 35 (hover over the grey line representing the battery value over time to see
this).
5. Check the Wandering application logs when the battery level goes under 40%.
The logs are similar to this:
This tutorial describes how to use the basic fleet management server to set object detection inference on EI
for AMR to run remotely at the OpenVINO™ model server when the robot's battery is lower than the 60%
threshold. If the battery is equal to or greater than 60%, the inference is set to be done locally on EI for AMR.
Prerequisites:
• The server is configured with the Get Started Guide for Robot Orchestration.
• The robot is onboarded with the Device Onboarding End-to-End Use Case.
Collaboration Diagram
When a robot’s battery level is less than 60%, basic fleet management tells the robot to do Remote
Inference. When the battery level is back to 60% or greater, basic fleet management tells the robot to do
Local Inference. The following diagram depicts the steps.
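The threshold rule described above can be sketched as a simple decision function. This is our illustration, not code from the product; the function name is ours, and the 60% value comes from this use case:

```python
REMOTE_THRESHOLD_PERCENT = 60.0  # threshold from this use case

def select_inference_target(battery_percent):
    # Below the threshold: offload object detection to the OpenVINO model server.
    # At or above the threshold: run inference locally on the robot.
    return "remote" if battery_percent < REMOTE_THRESHOLD_PERCENT else "local"

print(select_inference_target(45.0))   # below 60% -> remote
print(select_inference_target(60.0))   # at 60% -> local
```

Note that the boundary value 60% itself selects local inference, matching the "equal to or greater than 60%" wording above.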
OTA Updates
The OTA updates solution is based on the Basic Fleet Management architecture. The following diagram
presents the architecture, components, and communications for these use cases.
Prerequisites:
• The server is configured with the Get Started Guide for Robot Orchestration.
• The robot is onboarded with the Device Onboarding End-to-End Use Case.
NOTE If the operating system update fails, dpkg may have been interrupted in the past or the SOTA
cache directory is missing on the robot. Run the following commands to solve the issue:
Firmware Update
This example updates the Intel® RealSense™ camera firmware.
1. Preparation for the Intel® RealSense™ camera firmware update:
a. Download the firmware from https://dev.intelrealsense.com/docs/firmware-releases.
b. Place the .bin file that contains the firmware in a .tar.gz archive. Make sure that you archive
only the .bin file, not the entire directory.
c. Set up a basic HTTP server, and upload the .tar.gz to it as a trusted repository server.
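Step 1.b can also be done with a short script. A sketch that packs only the .bin file into a .tar.gz; the firmware file name here is a placeholder, not a real release:

```python
import os
import tarfile
import tempfile

workdir = tempfile.mkdtemp()
fw_bin = os.path.join(workdir, "camera-firmware.bin")  # placeholder name
with open(fw_bin, "wb") as f:
    f.write(b"\x00" * 16)  # dummy firmware contents for illustration

archive = os.path.join(workdir, "firmware.tar.gz")
with tarfile.open(archive, "w:gz") as tar:
    # arcname stores only the bare .bin file name, not its directory path
    tar.add(fw_bin, arcname=os.path.basename(fw_bin))

with tarfile.open(archive) as tar:
    members = tar.getnames()
print(members)  # the archive must contain only the .bin file
```

If the archive listing shows directory entries in addition to the .bin file, the FOTA flow above will not find the firmware.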
2. On the ThingsBoard* dashboard, click Trigger Config Update.
3. Choose:
• Command: append
• Path: trustedRepositories:http://url-to-http-server/and-optional-path-if-necessary/
4. Click Send, and observe the logs in the ThingsBoard* bottom screen.
5. Trigger the firmware update.
• BIOS Version: any number
• Fetch: http://url-to-http-server/and-optional-path-if-necessary/archive-with-firmware.tar.gz
• Manufacturer: set the value according to the following image
• Product: set the value according to the following image
• Release Date: the current date in the YYYY-MM-DD format
• Vendor: set the value according to the following image
• Server Username and Server Password: only used if the HTTP server is password protected
6. Click Send, and observe the logs in the ThingsBoard* bottom screen.
The client host reboots after the update completes.
Manifest Update to Trigger POTA (Operating system update and Firmware update)
Programming Over The Air, or POTA, includes both SOTA and FOTA.
NOTE A configuration update is still required if the image URL is not listed in the trusted repositories.
1. Trigger a POTA by clicking Manifest Update. Replace the default manifest with your text.
Example:
Container Update
Because the containers (or applications) are orchestrated and deployed by Intel® Smart Edge Open, the
ThingsBoard* Rule Chain routes AOTA trigger requests to the Intel® Smart Edge Open server. AOTA only
works for whole pod updates.
1. Verify the current version:
$ helm list -A
2. Create the directory for the Helm* charts:
$ mkdir -p /var/amr/helm-charts/
3. If you are onboarding with a Cogniteam* Pengo robot, make a copy of the AMR_server_containers/
01_docker_sdk_env/docker_orchestration/ansible-playbooks/01_amr/onboarding-pengo directory:
$ cd <edge_insights_for_amr_path>/Edge_Insights_for_Autonomous_Mobile_Robots_<version>/
AMR_server_containers/01_docker_sdk_env/docker_orchestration/ansible-playbooks/01_amr/
$ cp -r onboarding-pengo onboarding-pengo2
4. To create a new version of a Helm* chart, increment the version field in the Chart.yaml file:
$ nano onboarding-pengo2/helm_onboarding_pengo/Chart.yaml
$ cat onboarding-pengo2/helm_onboarding_pengo/Chart.yaml
apiVersion: v2
appVersion: 2022.3.0
description: A helm chart for onboarding-pengo
name: onboarding-pengo
type: application
version: 0.1.20
5. Move the new Helm* chart to /var/amr/helm-charts/.
$ mv onboarding-pengo2 /var/amr/helm-charts
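The version bump in step 4 can also be scripted rather than done in an editor. A sketch; the bump_patch helper is ours:

```python
import re

def bump_patch(chart_yaml_text):
    # Increment the patch number of the top-level 'version:' field,
    # e.g. 'version: 0.1.19' -> 'version: 0.1.20'.
    def repl(match):
        major, minor, patch = match.groups()
        return f"version: {major}.{minor}.{int(patch) + 1}"
    # ^ anchor plus MULTILINE keeps 'apiVersion:'/'appVersion:' untouched.
    return re.sub(r"^version: *(\d+)\.(\d+)\.(\d+)", repl, chart_yaml_text, flags=re.M)

chart = "apiVersion: v2\nappVersion: 2022.3.0\nname: onboarding-pengo\nversion: 0.1.19\n"
print(bump_patch(chart))
```

Anchoring the pattern at line start matters: without it, a naive substitution could corrupt the appVersion field, which must stay at the product release version.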
6. On the ThingsBoard* dashboard, click Trigger AOTA.
7. Select:
• App: application
• Command: update
• Container Tag: onboarding-pengo
• Version:
8. Click Send, and observe the logs in the ThingsBoard* bottom screen.
Known issue: A timeout error occurs after clicking Send because the AOTA request message is rerouted
to the Intel® Smart Edge Open server, and the robot never receives the request.
Expected results:
• On the Intel® Smart Edge Open server, check the mqtt_aota service status. It should look similar to
the following.
Tasks: 2 (limit: 618670)
Memory: 15.2M
CGroup: /system.slice/mqtt_aota.service
└─934368 /usr/bin/python3 /usr/local/bin/mqtt_onboard_aota.py
$ helm list -A
NAME NAMESPACE REVISION
UPDATED STATUS CHART
APP VERSION
cadvisor telemetry 1 2022-09-13 17:14:17.731573771
+0300 EEST deployed cadvisor-0.1.0 1
cert-manager cert-manager 1 2022-09-13 17:05:37.960015552
+0300 EEST deployed cert-manager-v1.6.1 v1.6.1
collab collab 1 2022-09-14 11:41:31.420169905
+0300 EEST deployed collab-0.1.0 2022.3.0
collab-dds-router collab-dds-router 1 2022-09-14 11:40:51.67619388
+0300 EEST deployed collab-dds-router-0.1.0 0.5.0-beta.9
collectd telemetry 1 2022-09-13 17:14:10.903887379
+0300 EEST deployed collectd-0.1.0 1
fleet fleet-management 1 2022-09-14 13:23:47.082994841
+0300 EEST deployed fleet-0.2.0 2022.3.0
grafana telemetry 1 2022-09-13 17:15:27.339867512
+0300 EEST deployed grafana-6.16.13 8.2.0
harbor-app harbor 1 2022-09-13 17:06:59.18688859
+0300 EEST deployed harbor-1.7.4 2.3.4
nfd-release smartedge-system 1 2022-09-13 17:12:20.988514637
+0300 EEST deployed node-feature-discovery-0.2.0 v0.9.0
onboarding onboarding 1 2022-09-14 15:04:07.609923463
+0300 EEST deployed onboarding-0.1.0 2022.3.0
onboarding-pengo onboarding-pengo 5 2022-09-17 15:38:26.58658014
+0300 EEST deployed onboarding-pengo-0.1.20 2022.3.0
ovms ovms-tls 1 2022-09-14 11:29:31.642868313
+0300 EEST deployed ovms-tls-0.2.0 2022.3.0
prometheus telemetry 1 2022-09-13 17:13:40.166305066
+0300 EEST deployed prometheus-14.9.2 2.26.0
statsd-exporter telemetry 1 2022-09-13 17:14:01.640302724
+0300 EEST deployed prometheus-statsd-exporter-0.4.1 0.22.1
Setting a Static IP
Depending on your network setup, there are multiple ways to set a static IP.
• In a home network, see your router’s instructions for how to set a static IP using your MAC address.
• In a corporate network, contact your local support team for how to set a static IP.
• To set it from your computer’s operating system:
1. Make sure that your computer has the correct date:
date
If the date is incorrect, contact your local support team for help setting the correct date and time.
2. Find the gateway:
virtualenv Error
If the following error is displayed:
Virtualenv location:
Warning: There was an unexpected error while activating your virtualenv. Continuing anyway…
Traceback (most recent call last):
File "./deploy.py", line 24, in <module>
from scripts import log_all
ImportError: cannot import name 'log_all' from 'scripts' (/home/test/.local/lib/python3.8/site-
packages/scripts/__init__.py)
Remove the ~/.local/lib/python3.8/ directory and run the following commands:
Python Error
If the following error is displayed:
Failed to install wget. b' ERROR: Command errored out with exit status 1:\n
command: /usr/bin/python3 -c \'import sys, setuptools, tokenize; sys.argv[0] = \'"\'"\'/tmp/pip-
install-6hcmet6a/wget/setup.py\'"\'"\'; __file__=\'"\'"\'/tmp/pip-install-6hcmet6a/wget/setup.py
\'"\'"\';f=getattr(tokenize, \'"\'"\'open\'"\'"\', open)
(__file__);code=f.read().replace(\'"\'"\'\\r\\n\'"\'"\', \'"\'"\'\\n
\'"\'"\');f.close();exec(compile(code, __file__, \'"\'"\'exec\'"\'"\'))\' egg_info --egg-
base /tmp/pip-pip-egg-info-7_3nl4xa\n cwd: /tmp/pip-install-6hcmet6a/wget/\n Complete
output (17 lines):\n Traceback (most recent call last):\n File "<string>", line 1, in
<module>\n File "/tmp/pip-install-6hcmet6a/wget/setup.py", line 15, in <module>\n
setup(\n File "/usr/local/lib/python3.8/dist-packages/setuptools/_distutils/core.py", line
147, in setup\n _setup_distribution = dist = klass(attrs)\n File "/usr/local/lib/
python3.8/dist-packages/setuptools/dist.py", line 476, in __init__\n
_Distribution.__init__(\n File "/usr/local/lib/python3.8/dist-packages/setuptools/
_distutils/dist.py", line 280, in __init__\n self.finalize_options()\n File "/usr/
local/lib/python3.8/dist-packages/setuptools/dist.py", line 899, in finalize_options\n
for ep in sorted(loaded, key=by_order):\n File "/usr/local/lib/python3.8/dist-packages/
setuptools/dist.py", line 898, in <lambda>\n loaded = map(lambda e: e.load(), filtered)
\n File "/usr/local/lib/python3.8/dist-packages/setuptools/_vendor/importlib_metadata/
__init__.py", line 196, in load\n return functools.reduce(getattr, attrs, module)\n
AttributeError: type object \'Distribution\' has no attribute \'_finalize_feature_opts\'\n
----------------------------------------\nERROR: Command errored out with exit status 1: python
setup.py egg_info Check the logs for full command output.\nWARNING: You are using pip version
20.2.4; however, version 22.2.2 is available.\nYou should consider upgrading via the \'/usr/bin/
python3 -m pip install --upgrade pip\' command.\n'
Remove the ~/.local/lib/python3.8/ directory, and run the following commands:
termcolor Error
If the following error is displayed:
rm /tmp/IntelSHA2RootChain-Base64.zip
update-ca-certificates
# Install isecl attestation components (TA, ihub, isecl k8s controller and scheduler extension)
platform_attestation_node: false
MSG:
sriovnetwork.sriovnetwork.openshift.io/sriov-vfio-network-c1p1 unchanged
STDERR:
Error from server (no supported NIC is selected by the nicSelector in CR sriov-netdev-net-c0p0):
error when creating "sriov-netdev-net-c0p0-sriov_network_node_policy.yml": admission webhook
"operator-webhook.sriovnetwork.openshift.io" denied the request: no supported NIC is selected by
the nicSelector in CR sriov-netdev-net-c0p0
Error from server (no supported NIC is selected by the nicSelector in CR sriov-netdev-net-c1p0):
error when creating "sriov-netdev-net-c1p0-sriov_network_node_policy.yml": admission webhook
"operator-webhook.sriovnetwork.openshift.io" denied the request: no supported NIC is selected by
the nicSelector in CR sriov-netdev-net-c1p0
Error from server (no supported NIC is selected by the nicSelector in CR sriov-vfio-pci-net-
c0p1): error when creating "sriov-vfio-pci-net-c0p1-sriov_network_node_policy.yml": admission
webhook "operator-webhook.sriovnetwork.openshift.io" denied the request: no supported NIC is
selected by the nicSelector in CR sriov-vfio-pci-net-c0p1
Error from server (no supported NIC is selected by the nicSelector in CR sriov-vfio-pci-net-
c1p1): error when creating "sriov-vfio-pci-net-c1p1-sriov_network_node_policy.yml": admission
webhook "operator-webhook.sriovnetwork.openshift.io" denied the request: no supported NIC is
selected by the nicSelector in CR sriov-vfio-pci-net-c1p1
Update the ~/dek/inventory/default/group_vars/all/10-default.yml file with:
sriov_network_operator_enable: false
MSG:
AnsibleError: Unexpected templating type error occurred on (# SPDX-License-Identifier: Apache-2.0
# Copyright (c) 2020 Intel Corporation
apiVersion: v1
kind: ConfigMap
metadata:
name: grafana-datasources
namespace: telemetry
labels:
grafana_datasource: '1'
data:
prometheus-tls.yaml: |-
apiVersion: 1
datasources:
- name: Prometheus-TLS
access: proxy
editable: true
orgId: 1
type: prometheus
url: https://prometheus:9099
withCredentials: true
isDefault: true
jsonData:
tlsAuth: true
tlsAuthWithCACert: true
secureJsonData:
tlsCACert: |
{{ telemetry_root_ca_cert.stdout | trim | indent(width=13, indentfirst=False) }}
tlsClientCert: |
{{ telemetry_grafana_cert.stdout | trim | indent(width=13, indentfirst=False) }}
tlsClientKey: |
{{ telemetry_grafana_key.stdout | trim | indent(width=13, indentfirst=False) }}
version: 1
editable: false
): do_indent() got an unexpected keyword argument 'indentfirst'
Update the ~/dek/roles/telemetry/grafana/templates/prometheus-tls-datasource.yml file with:
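The error itself points at Jinja2 3.x, which renamed the indent filter's indentfirst keyword argument to first. Assuming that is the cause, each offending template line changes along these lines (shown for the CA certificate entry; the two client certificate entries are analogous):

```yaml
tlsCACert: |
{{ telemetry_root_ca_cert.stdout | trim | indent(width=13, first=False) }}
```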
Installation Stuck
If the installation remains stuck with the following log:
MSG:
Status: active
Logging: on (low)
Default: deny (incoming), allow (outgoing), deny (routed)
New profiles: skip
To Action From
-- ------ ----
22/tcp ALLOW IN Anywhere
22/tcp (v6) ALLOW IN Anywhere (v6)
Type Ctrl-c, and restart the installation. (Run the ./deploy.sh script again.)
docker-compose Failure
If you see an error message that docker-compose fails with some variables not defined, add the
environment variables to .bashrc so that they are available to all terminals:
export DOCKER_BUILDKIT=1
export COMPOSE_DOCKER_CLI_BUILD=1
export DOCKER_HOSTNAME=$(hostname)
export DOCKER_USER_ID=$(id -u)
export DOCKER_GROUP_ID=$(id -g)
export DOCKER_USER=$(whoami)
# Check with command
env | grep DOCKER
ansible-playbook AMR_server_containers/01_docker_sdk_env/docker_orchestration/ansible-playbooks/
02_edge_server/fleet_management/fleet_management_playbook_uninstall.yaml
2. After uninstalling the playbook, wait several seconds for all fleet related containers to stop. Verify that
there are no fleet containers running:
ansible-playbook AMR_server_containers/01_docker_sdk_env/docker_orchestration/ansible-playbooks/
02_edge_server/fleet_management/fleet_management_playbook_install.yaml
NOTE This only restarts the ThingsBoard* server, without Intel® Smart Edge Open.
• Reset the database to the preconfigured state (with customizations from Intel), and restart the server:
NOTE This only restarts the ThingsBoard* server, without Intel® Smart Edge Open.
• When you deploy the ThingsBoard* container using the Intel® Smart Edge Open Ansible* playbook,
sometimes the server cannot start due to the following error:
ansible-playbook AMR_server_containers/01_docker_sdk_env/docker_orchestration/ansible-playbooks/
02_edge_server/fleet_management/fleet_management_playbook_uninstall.yaml
ansible-playbook AMR_server_containers/01_docker_sdk_env/docker_orchestration/ansible-playbooks/
02_edge_server/fleet_management/fleet_management_playbook_install.yaml
Result: The database is reset to the preconfigured database provided by Intel.
If COMMAND PASSED is displayed, configure your browser to NOT use a proxy when accessing
the IP/hostname of the server.
fatal: [localhost]: FAILED! => {"changed": false, "msg": "Logging into 10.237.22.88:30003 for
user admin
failed - 500 Server Error for http+docker://localhost/v1.41/auth: Internal Server Error
(\"Get \"https://10.237.22.88:30003/v2/\": dial tcp 10.237.22.88:30003: connect: connection
refused\")"}
1. Wait two minutes until the server is up and running.
2. Verify that all pods are running and no errors are reported:
3. After all pods and services are up and running, restart the basic fleet management server:
ansible-playbook AMR_server_containers/01_docker_sdk_env/docker_orchestration/ansible-playbooks/
02_edge_server/fleet_management/fleet_management_playbook_install.yaml
python
>>> import psutil
>>> battery = psutil.sensors_battery()
>>> print("Battery percentage : ", battery.percent)
Battery percentage : 43
When the battery bridge is installed on the robot, the two commands below are equivalent. When you launch
the kobuki node, it publishes the battery percentage on the /sensors/battery_state topic. You can also do the
same using the ros2 topic pub command.
To configure new basic fleet management clients (1-to-1 mapping), retrieve the new tokens of the new
Devices with Copy access token.
cd components/amr_battery_bridge_kernel_module/src/
# uninstall battery-bridge-kernel-module
sudo ./module_install.sh -u
# check if below path exists
ls /sys/class/power_supply/BAT0
If the above path exists, another kernel module already occupies that place, and the provided
battery-bridge-kernel-module cannot be installed.
In this case, the provided solution does not work.
Overview
Intel® Edge Software Device Qualification (Intel® ESDQ) for EI for AMR provides customers with the capability
to run an Intel-provided test suite on the target system, with the goal of enabling partners to determine their
platform’s compatibility with EI for AMR.
The target of this self-certification suite is the EI for AMR compute systems. These platforms are the brain of
the Robot Kit. They are responsible for getting input from sensors, analyzing it, and giving instructions to the
motors and wheels to move the EI for AMR.
How It Works
The EI for AMR Test Modules interact with the Intel® ESDQ CLI through a common test module interface
(TMI) layer, which is part of the Intel® ESDQ binary. Intel® ESDQ generates a complete test report in HTML
format, along with detailed logs packaged as one zip file, which you can manually choose to email to:
[email protected]
NOTE Each test and its pass/fail criteria is described below. To jump to the installation process, go to
Download and Install Intel® ESDQ for EI for AMR.
Intel® ESDQ for EI for AMR contains the following test modules.
• Docker* Container
This module verifies that EI for AMR comes as a Docker* container and that it can run on the target
platform.
For more information, go to the Docker* website.
The test is considered Pass if:
• The Docker* container can be opened.
• Intel® RealSense™ Camera
This module verifies the capabilities of the Intel® RealSense™ technology on the target platform.
For more information, go to the Intel® RealSense™ website.
The tests within this module verify that the following features are installed properly on the target platform
and that EI for AMR and the Intel® RealSense™ camera are functioning properly:
• The camera is detected and is working.
• Intel® RealSense™ SDK.
The tests are considered Pass if:
• The Intel® RealSense™ SDK 2.0 libraries are present in the Docker* container.
• A simple C++ file can be compiled using g++ and the -lrealsense2 flag.
• Intel® RealSense™ topics are listed and published.
• The number of FPS (frames per second) is as expected.
• Intel® VTune™ Profiler
This module runs the Intel® VTune™ Profiler on the target system.
For more information, go to the Intel® VTune™ Profiler website.
The test is considered Pass if:
• VTune™ Profiler runs without errors.
• VTune™ Profiler collects Platform information.
• rviz2 and FastMapping
This module runs the FastMapping application (the version of OctoMap optimized by Intel) on the target
system and uses rviz2 to verify that it works as expected.
For more information, go to the rviz wiki.
The test is considered Pass if:
• FastMapping is able to create a map out of a pre-recorded ROS 2 bag.
• Turtlesim
This module runs the Turtlesim ROS 2 application on the target system and checks if it works as expected.
For more information, go to the Turtlesim wiki.
The test is considered Pass if:
• Turtlesim opens and runs without error.
• Intel® oneAPI Base Toolkit
This module verifies some basic capabilities of Intel® oneAPI Base Toolkit on the target platform.
For more information, go to the Intel® oneAPI Base Toolkit website.
The tests within this module verify that the following features are functioning properly on the target
platform:
• DPC++ compiler
• CUDA to DPC++ converter
This test is considered Pass if:
• A simple C++ file can be compiled using the DPC++ compiler, and it runs as expected.
• CUDA can be installed.
• No error messages are displayed while running the gst-launch command.
This test may FAIL, or it may be skipped, if the target system does not have an Intel® RealSense™ camera
connected.
• ADBSCAN
This module verifies if the ADBSCAN algorithm works on the target system.
The test is considered Pass if:
• The ADBSCAN algorithm works on the target system.
• Collaborative Visual SLAM
This module verifies if the collaborative visual SLAM algorithm works on the target system.
The test is considered Pass if:
• The collaborative visual SLAM algorithm works on the target system.
• Kudan visual SLAM
This module verifies if the Kudan visual SLAM algorithm works on the target system.
The test is considered Pass if:
• The Kudan visual SLAM algorithm works on the target system.
Get Started
This tutorial takes you through installing the Intel® ESDQ CLI tool, which is installed as part of EI for AMR.
Refer to How It Works before starting the installation. To use this tutorial, you must Download and Install
Intel® ESDQ for EI for AMR.
f. Make sure that AMR Bag Files and AMR Kudan SLAM are selected:
g. Click Next until you get to the Download page, and click Download.
2. Follow the steps in the Get Started Guide for Robots to extract and install EI for AMR.
3. Run your target self-certification. Intel® ESDQ for EI for AMR contains three types of self-certifications:
• Run the Self-Certification Application for Compute Systems for certifying Intel® based compute
systems with the EI for AMR software
• Run the Self-Certification Application for RGB Cameras for certifying RGB cameras with the EI for
AMR software
• Run the Self-Certification Application for Depth Cameras for certifying depth cameras with the EI for
AMR software
cd $HOME/edge_software_device_qualification/Edge_Software_Device_Qualification_For_AMR_*/esdq
2. Unzip the ROS 2 bags used in the tests:
export ROS_DOMAIN_ID=19
./esdq run -r
Expected output (These results are for illustration purposes only.)
NOTE The OpenVINO™ Object Detection Myriad Test Failure above is shown for demonstration
purposes only. The test is expected to pass.
NOTE The following steps use the Intel® RealSense™ ROS 2 node as an example. You must change the
node to your actual camera ROS 2 node.
cd $HOME/edge_software_device_qualification/Edge_Software_Device_Qualification_For_AMR_*/esdq
2. Start the sensor ROS 2 node:
a. Replace the commands with the commands you use to start the Docker* container for the RGB
camera under certification. If there is no Docker* container, run the RGB camera node in a
ROS 2 environment after setting the ROS_DOMAIN_ID.
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
docker-compose -f 01_docker_sdk_env/docker_compose/01_amr/amr-sdk.all.yml run realsense bash
b. Replace the commands with the commands you use to start the ROS 2 node for the RGB camera
under certification.
source ros_entrypoint.sh
# set a unique id here that is used in all terminals
export ROS_DOMAIN_ID=19
ros2 launch realsense2_camera rs_launch.py &
c. The self-certification test expects the camera stream to be on the “/camera/color/image_raw”
topic. This topic must be visible in rviz2 using the “camera_color_frame” fixed frame. If your
camera ROS 2 node does not stream to that topic by default, use ROS 2 remapping to publish to
that topic.
export ROS_DOMAIN_ID=19
./esdq run -r -p "sensors_rgb"
NOTE The following steps use the Intel® RealSense™ ROS 2 node as an example. You must change the
node to your actual camera ROS 2 node.
cd $HOME/edge_software_device_qualification/Edge_Software_Device_Qualification_For_AMR_*/esdq
source ./01_docker_sdk_env/docker_compose/common/docker_compose.source
export CONTAINER_BASE_PATH=`pwd`
docker-compose -f 01_docker_sdk_env/docker_compose/01_amr/amr-sdk.all.yml run realsense bash
b. Replace the commands with the commands you use to start the ROS 2 node for the depth camera
under certification.
source ros_entrypoint.sh
# set a unique id here that is used in all terminals
export ROS_DOMAIN_ID=19
ros2 launch realsense2_camera rs_launch.py pointcloud.enable:=true &
c. The self-certification test expects the camera stream to be on the “/camera/depth/color/points”
and “/camera/depth/image_rect_raw” topics. These topics must be visible in rviz2 using the
“camera_link” fixed frame. If your camera ROS 2 node does not stream to those topics by default,
use ROS 2 remapping to publish to them.
export ROS_DOMAIN_ID=19
./esdq run -r -p "sensors_depth"
Troubleshooting
For issues, go to: Troubleshooting for Robot Tutorials.
Support Forum
If you’re unable to resolve your issues, contact the Support Forum.
Security
This section provides an overview of the security features offered by the Edge Insights for Autonomous
Mobile Robots (EI for AMR) platform. For further reading, refer to the specific documents listed below.
Due to architecture constraints, only the developer of the application code can implement the matching shim
layer correctly.
The following picture shows a potential implementation of a shim layer wrapped like a shell around the
customer application.
Keep in mind that complex checks and additional layers might have an impact on the overall system
performance. In general, it is highly recommended to check regularly for updates and vulnerabilities on the
component websites.
Authentication
Authentication helps to develop a secure system. A run-time authentication system is the next step after
secure boot. Any program code can be authenticated before it is executed by the system. This powerful tool
enables AMR suppliers to guarantee a level of security and safety during run time: executing code from an
unknown source, or malware, would not be possible.
The Intel® Dynamic Application Loader (Intel® DAL) is a feature of Intel® platforms that allows you to run
small portions of Java* code on Intel® Converged Security and Management Engine (Intel® CSME) firmware.
Intel has developed the DAL Host Interface Daemon (also known as JHI), which contains the APIs that enable a
Linux* operating system to communicate with Intel DAL. The daemon is available both in a standalone
software package and as part of the Linux* Yocto 64-bit distribution.
More information about the described use cases and features can be found in the following documents:
Virtualization
Virtualization is another important element to increase the level of security and safety. It helps to establish
freedom from interference (FFI), as required for safety use cases, and enables workload consolidation. Intel
devices have supported these use cases with Intel® Virtualization Technology (Intel® VT) for decades.
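Whether the platform exposes Intel® VT-x to the operating system can be checked from Linux; a minimal sketch:

```shell
# The vmx CPU flag indicates Intel VT-x support; if it is absent, the
# feature is either not present or disabled (e.g. in the BIOS).
if grep -qw vmx /proc/cpuinfo; then
    echo "Intel VT-x is exposed to the OS"
else
    echo "Intel VT-x not advertised (check BIOS settings)"
fi
```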
More information about the described use cases and features can be found in the following documents:
Encryption
Encryption is required for many security use cases. The EI for AMR platform supports common encryption
algorithms such as AES and RSA in hardware, which increases encryption/decryption performance and the
security level. Typical use cases are the encryption of communication messages, of a file system, or of single
files for IP protection, and the creation of a secure storage for security-relevant data such as crypto keys or
passwords. Another use case is memory encryption; the EI for AMR platform supports this with the Total
Memory Encryption (TME) feature.
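As a minimal illustration of file encryption with AES, the openssl command line can encrypt and decrypt a single file. This sketch is not an EI for AMR-specific API, and a real deployment would use securely stored keys (e.g. in a secure storage) rather than an inline password:

```shell
# Encrypt a file with AES-256-CBC using a PBKDF2-derived key, then decrypt it.
echo "robot fleet credentials" > secret.txt
openssl enc -aes-256-cbc -pbkdf2 -salt -in secret.txt -out secret.enc -pass pass:demo-only
openssl enc -d -aes-256-cbc -pbkdf2 -in secret.enc -out secret.dec -pass pass:demo-only

# The decrypted copy matches the original.
cmp secret.txt secret.dec && echo "round-trip OK"
```

On processors with AES-NI, openssl uses the hardware AES instructions automatically, which is the performance benefit described above.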
More information about the described use cases and features can be found in the following documents:
Firmware Update
To improve the security and safety status over the lifetime of a device, the internal firmware (e.g., BIOS)
must be updatable. In this case, the update packages are signed by the supplier (e.g., Intel or an OEM).
More information about the described use cases and features can be found in the following document:
Secure Debug
Debugging is an important feature during product development. During in-field usage, debugging might also
be needed to analyze field returns. To prevent unauthorized access to internal resources via the debugger, a
secure debugging system can be implemented. In this case, an engineer who wants to use the debugger must
authenticate with a valid token that is presented to the system (e.g., by storing it in flash). Tokens must be
signed with a key that was stored in the device fuses during the manufacturing flow.
More information about the described use cases and features can be found in the following documents:
Real-Time Support
Intel real-time technology supports new solutions that require a high degree of coordination, both within and
across network devices. Intel® Time Coordinated Computing (Intel® TCC)-enabled processors deliver optimal
compute and time performance for real-time applications. Using integrated or discrete Ethernet controllers
featuring IEEE 802.1 Time Sensitive Networking (TSN), these processors can power complex real-time
systems.
For more information, refer to:
• Intel Real-Time Computing IoT Technology Resources
• Intel® Time Coordinated Computing Tools
The baseline support for Intel® TCC consists of a real-time Linux* kernel and the Intel® TCC Mode in the
BIOS. These features may be sufficient to satisfy many use cases, with initial cycle times possible in the
hundreds of microseconds.
Requirements
• Processor with support for Intel® TCC, such as the Intel® Core™ i7-1185GRE, i5-1145GRE, and
i3-1115GRE processors (the complete list of compatible devices is on the Intel® TCC Tools webpage)
• BIOS with support for Intel® TCC, such as Intel’s reference BIOS
NOTE The reference BIOS may differ from BIOS versions offered by BIOS vendors. If you cannot find
Intel® TCC Mode in the BIOS, contact your BIOS vendor for more information.
Configure, Build, and Install Intel’s Linux* kernel with Real-Time Support
1. Install required dependencies:
sudo apt -y install build-essential gcc bc bison flex libssl-dev libncurses5-dev libelf-dev dwarves zstd
2. Clone Intel’s Linux* kernel git repository:
cd linux-intel-lts
4. Reset to the latest 5.15 RT tag:
make olddefconfig
6. Configure the kernel EI for AMR options:
scripts/config --module CONFIG_PINCTRL_TIGERLAKE
scripts/config --enable CONFIG_INTEL_IPU6_TGLRVP_PDATA
scripts/config --set-str SYSTEM_TRUSTED_KEYS ""
scripts/config --set-str CONFIG_SYSTEM_REVOCATION_KEYS ""
7. Configure the Intel® TCC-related options:
sudo update-grub
13. Reboot the board:
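After rebooting, whether the real-time kernel is active can be verified; a sketch (the exact version string depends on the kernel build):

```shell
# A PREEMPT_RT kernel normally reports it in the kernel version string.
if uname -v | grep -q PREEMPT_RT; then
    echo "real-time kernel active: $(uname -r)"
else
    echo "standard kernel running: $(uname -r)"
fi
```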
Terminology
Term   Description
AI     Artificial Intelligence
AT     ATtention
DL     Deep Learning
EOF    End-Of-File
FM     FastMapping
IE     Inference Engine
IP     Intellectual Property
JIT    Just-In-Time
NN     Neural Network
OS     Operating System
RV     RendezVous