The sense I get from my research is that this tutorial is intended for the full ZR300 Developer Kit camera. Whilst the Euclid does have a ZR300 too, it is a modified form of the camera that lacks an ASIC chip that the full kit camera has. This missing chip is software-simulated on Euclid by using a special forked version of Librealsense, not the main branch of Librealsense.
My recommendation would be to use the form of SLAM that is built into Euclid. This tutorial on setting up RTAB-Map SLAM on Euclid may be of interest to you.
Euclid also comes with a 6DOF Scenario that makes use of the Euclid SLAM module.
I am not a Euclid specialist, so I cannot say for certain whether "slam_tutorial_1_gui" can work with the default setup of Euclid. One of the Euclid specialists who read and comment on this forum may be able to offer a second opinion later.
I believe you can install the main Librealsense on Euclid in place of the forked version, but you will lose access to the Euclid functions that the forked version supports. The fisheye camera in particular would likely be severely affected.
The Intel RealSense SLAM is preinstalled on the Euclid. You can use it by running the following two launch files:
roslaunch realsense_camera lr200m_nodelet_default.launch
And in a second terminal:
roslaunch realsense_sp sp.launch
This will publish the topics /tf and /realsense/odom, which contain the 6DOF pose information for the Euclid.
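The orientation in a standard ROS nav_msgs/Odometry message (such as the one on /realsense/odom) is a quaternion, which is not very readable directly. Below is a minimal pure-Python sketch (no ROS dependency, so you can test the math anywhere) of converting that quaternion into roll/pitch/yaw; the function name `quaternion_to_euler` is my own, and it assumes the usual ROS right-handed, z-up convention:

```python
import math

def quaternion_to_euler(x, y, z, w):
    """Convert a quaternion (as found in nav_msgs/Odometry
    pose.pose.orientation) to roll, pitch, yaw in radians."""
    # roll: rotation about the x-axis
    sinr_cosp = 2.0 * (w * x + y * z)
    cosr_cosp = 1.0 - 2.0 * (x * x + y * y)
    roll = math.atan2(sinr_cosp, cosr_cosp)

    # pitch: rotation about the y-axis, clamped to avoid asin domain errors
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)

    # yaw: rotation about the z-axis (the robot's heading in the map frame)
    siny_cosp = 2.0 * (w * z + x * y)
    cosy_cosp = 1.0 - 2.0 * (y * y + z * z)
    yaw = math.atan2(siny_cosp, cosy_cosp)
    return roll, pitch, yaw

# Identity quaternion -> no rotation
print(quaternion_to_euler(0.0, 0.0, 0.0, 1.0))
# A 90-degree turn about z: w = cos(45 deg), z = sin(45 deg)
_, _, yaw = quaternion_to_euler(0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
print(round(yaw, 4))  # 1.5708, i.e. pi/2
```

In a real node you would feed the four fields of `msg.pose.pose.orientation` into this from a rospy subscriber callback; the tf library also provides an equivalent `euler_from_quaternion` helper if you prefer not to hand-roll the math.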
Unfortunately, in order to display the map itself, modifications need to be made (since the SLAM module was originally made for the ZR300 and not for a ROS environment).
I would strongly recommend against using a librealsense version other than the Euclid's forked one, as that can cause errors in the firmware of the Euclid's camera.
I would suggest using GMapping, RTAB-Map, or ORB-SLAM; I have made walkthroughs for all of them, available here in the forum or at euclidcommunity.intel.com
Let me know if you need any further help and I'll be happy to help out!
Intel Euclid Development Team
I followed Jay's GitHub repository: https://github.com/robojay/realsense_sp , and got SLAM working.
I can get the occupancy map shown in RViz using the realsense_sp ROS package.
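The occupancy map that RViz displays is a nav_msgs/OccupancyGrid message, whose `data` field is a flat, row-major list of int8 values (-1 = unknown, 0..100 = occupancy probability in percent). As a minimal pure-Python sketch of how to interpret that field outside of RViz (the helper names `grid_to_rows` and `render_ascii` are my own, and the threshold of 65 is just an illustrative choice):

```python
def grid_to_rows(data, width):
    """Reshape the flat row-major OccupancyGrid data into rows.
    Cell values: -1 = unknown, 0..100 = occupancy probability (%)."""
    return [data[i:i + width] for i in range(0, len(data), width)]

def render_ascii(rows, occupied_threshold=65):
    """Render a small grid as text: '#' occupied, '.' free, '?' unknown."""
    lines = []
    for row in rows:
        lines.append(''.join(
            '?' if cell < 0 else ('#' if cell >= occupied_threshold else '.')
            for cell in row
        ))
    return '\n'.join(lines)

# Hypothetical 4x2 fragment of a map
data = [0, 0, 100, -1,
        0, 100, 100, -1]
rows = grid_to_rows(data, width=4)
print(render_ascii(rows))
# ..#?
# .##?
```

The real message also carries `info.width`, `info.height`, `info.resolution`, and `info.origin`, which you would use to convert between cell indices and metric map coordinates.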
I also tried RTAB-Map (the introlab/rtabmap_ros ROS package).
It works, and I can get the 3D map and see the loop closure process.
It's cool. Thanks.
An article on RTABMap that uses Kinect as its camera has a section on recovering lost odometry. The principles may be applicable to your RealSense camera.