
Unable to re-create the blog post. #10

Open
ikuzmych opened this issue Feb 24, 2023 · 6 comments

Comments

@ikuzmych

I've been following this blog post and trying to re-create it using an iRobot:
https://nnarain.github.io/2021/01/06/iRobot-Create-2-Navigation.html

I am not able to recreate the part where you give two points on a grid and a point cloud is built from the bump sensors in simulation.

At the very end of the simulation blog post, we ran roslaunch create_gazebo maze.launch and Gazebo opened just as shown in the post. In the navigation blog post, however, when we run the final command, roslaunch create_gazebo maze.launch nav_mode:=map, the Roomba in your GIF creates a point cloud as it moves. For us, the robot moves in RViz, but no point cloud appears when it is near a wall, and the map is not updated with the borders.

We get this error for all six light sensors:

[ERROR] Error computing light sensor poisitions: "right_front_light_sensor_link" passed to lookupTransform argument source_frame does not exist.

We are also unable to re-create the part of the Navigation blog post where you give a navigation goal on the grid and a map is built from the bump sensors. When we run roslaunch create_gazebo maze.launch nav_mode:=mapping and set a navigation goal in RViz, the robot tries to reach it but gets blocked by a wall and never builds or maps that wall, unlike your GIF where a point cloud maps out the maze.

@nnarain
Owner

nnarain commented Feb 24, 2023

Are you using my fork of the create_robot package?

The light sensors are defined in the URDF there:
https://github.com/nnarain/create_robot/blob/melodic-devel/create_description/urdf/create_2.urdf.xacro
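For reference, each light-sensor frame in that URDF is a fixed link attached to the robot body, along the lines of the following sketch (link names match the error above, but the origins and values here are placeholders; the real definitions are in create_2.urdf.xacro in the fork):

```xml
<!-- Hypothetical, simplified sketch of one light-sensor frame.
     The actual definitions live in create_2.urdf.xacro linked above. -->
<link name="right_front_light_sensor_link"/>

<joint name="right_front_light_sensor_joint" type="fixed">
  <parent link="base_link"/>
  <child link="right_front_light_sensor_link"/>
  <!-- Pose relative to base_link; placeholder values -->
  <origin xyz="0.13 -0.06 0.06" rpy="0 0 -0.5"/>
</joint>
```

If these links are missing from the robot description (for example, because the upstream create_robot URDF is used instead of the fork), lookupTransform fails with exactly the "source_frame does not exist" error shown above.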

@ikuzmych
Author

Thank you, we fixed the simulation, and now we want to test it on our actual iRobot. My iRobot is hooked up to my Pi through a USB-TTL connection. How do I use roslaunch to start mapping with the actual iRobot?

@nnarain
Owner

nnarain commented Feb 24, 2023

It should be the same command. The idea is to have octomap_server running and then start exploring. Keep in mind that the light sensors in Gazebo are simulated with ray casts; in real life they are not as reliable (subject to ambient light and so on), and I had limited success with them. You might want to consider a cheap-ish, off-the-shelf lidar for something more reliable. I have one of these: https://ca.robotshop.com/products/ydlidar-x2-360-laser-scanner though I've not actually integrated it. It depends on what you want to accomplish.
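The pieces are the same on the real robot as in simulation: the driver publishes sensor data, and octomap_server builds an occupancy map from the resulting point cloud. A top-level launch file might look roughly like this (a sketch; the topic name and parameter values are assumptions, so check the blog post's launch files for the actual ones):

```xml
<launch>
  <!-- octomap_server builds an occupancy map from incoming point clouds.
       "cloud_in" must be remapped to whatever topic carries the sensor points. -->
  <node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
    <param name="frame_id" value="map"/>
    <param name="resolution" value="0.05"/>
    <remap from="cloud_in" to="light_sensor_points"/>
  </node>
</launch>
```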

@ikuzmych
Author

ikuzmych commented Feb 24, 2023 via email

@nnarain
Owner

nnarain commented Feb 24, 2023

To clarify: launching the nav stack is the same command. You will also need to launch the create_driver on the robot itself.
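On the robot, that would be something like the following (a sketch assuming the create_bringup package from the create_robot repo; the serial port for the USB-TTL adapter is configured in the driver's config file):

```xml
<launch>
  <!-- Start the Create 2 driver on the robot itself.
       The serial port (e.g. /dev/ttyUSB0) is set in the driver's config. -->
  <include file="$(find create_bringup)/launch/create_2.launch"/>
</launch>
```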

@ToxemicGuitar82

Thank you! We were able to implement our lidar and autonomous navigation and have been testing the past few days :)
