
Do planning & perception in the robot frame? #285

Open
DavidB-CMU opened this issue Mar 27, 2016 · 5 comments

Comments

@DavidB-CMU
Contributor

I had some problems with HERB's pose drifting because it could only see one Stargazer landmark. I cleaned the sensor lens and the IR stickers, and it works better now: it's detecting a minimum of two markers with the robot in a similar position.

Tom gave us a lot of help in this thread; should we give him a status update?
cra-ros-pkg/robot_localization#221

Also I propose that we do planning and perception activities in the robot's base frame.
I think we should leave the map-to-robot frame transform as something that only concerns the localization and mapping problem.

@DavidB-CMU
Contributor Author

Here are two screenshots taken ~30 seconds apart, showing the large amount of drift:

[screenshot: pose_drift1]

It's better now that the stargazer is detecting two landmarks again.

@psigen
Member

psigen commented Mar 29, 2016

Perhaps this should be a discussion for herbpy?

Nothing in prpy has knowledge about what the preferred base frame should be. herbpy makes the decision that this should be a "world" frame, and the robot should move in it. However, it could just as easily leave the robot at a static position in the base frame and move other objects around it as HERB moves.
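To make the equivalence concrete: the two conventions described above differ only by one rigid transform. A minimal sketch (plain Python, planar 2-D transforms for brevity; the frame names and poses are made up, and this is not prpy or herbpy code):

```python
import math

def mat_mul(A, B):
    """Multiply two 3x3 homogeneous transforms."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def se2(x, y, theta):
    """Homogeneous 2-D rigid transform (planar base motion)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]]

def invert(T):
    """Inverse of a rigid 2-D transform: (R, t) -> (R^T, -R^T t)."""
    c, s = T[0][0], T[1][0]
    x, y = T[0][2], T[1][2]
    return [[c, s, -(c * x + s * y)],
            [-s, c, s * x - c * y],
            [0.0, 0.0, 1.0]]

# Convention A (what herbpy chooses today): the robot moves in a
# fixed "world" frame, and objects are posed in that world frame.
T_world_robot = se2(1.0, 2.0, math.pi / 2)
T_world_mug = se2(3.0, 0.5, 0.0)

# Convention B: leave the robot at a static pose (the origin) and
# move the other kinbodies instead -- each object's pose is simply
# re-expressed in the robot's base frame.
T_robot_mug = mat_mul(invert(T_world_robot), T_world_mug)
```

Composing `T_world_robot` with `T_robot_mug` recovers `T_world_mug` exactly, which is the sense in which either convention works: the choice only fixes which frame absorbs the localization error.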

@mkoval
Member

mkoval commented Mar 29, 2016

This discussion definitely belongs here because prpy is the home for prpy.perception. This code is used on robots other than HERB.


In response to @DavidB-CMU's original question:

@Shushman already added this functionality to ApriltagsModule in ?? by adding the reference_link parameter to the constructor. We should make the same change to the VNCC, Rock, and SimTrack modules.

However, @cdellin made an argument that we should fix the underlying oscillation, not hack around it by running perception in the robot frame. This work-around will break as soon as we move the base, which will hopefully become more common with the upcoming hardware changes. Instead, we can unsubscribe from the Stargazer in OpenRAVE when we know the robot is stationary.
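The "ignore the Stargazer while stationary" idea can be sketched as a small gate. None of these names exist in prpy; this only illustrates the logic (in a real ROS setup the same effect could come from unregistering the Stargazer subscriber while the base controller reports no motion):

```python
# Hypothetical sketch, not prpy code: hold the last accepted pose
# while the base is parked, so oscillating marker detections cannot
# drift a stationary robot.
class StargazerGate:
    def __init__(self):
        self._base_moving = False
        self._pose = None

    def set_base_moving(self, moving):
        """Called by whatever drives the base (e.g. a base controller)."""
        self._base_moving = moving

    def on_pose(self, pose):
        """Accept the first pose unconditionally, then only while the
        base is actually moving; otherwise return the held pose."""
        if self._pose is None or self._base_moving:
            self._pose = pose
        return self._pose
```

The design choice here matches @cdellin's point: the fix lives at the localization input, so planning and perception never need to know which frame absorbed the noise.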

@jeking04 Are we already doing this in the table clearing demo?

@psigen
Member

psigen commented Mar 29, 2016

I'm confused. Why does prpy.perception have any idea if it is operating in the robot or world base-frame? It seems like any notion of that is encoded in the TF transforms that are in use.

I don't think planning in the robot frame is necessarily a bad idea: it means that object persistence is more challenging, but it avoids compounding localization error with perception error.

But I do not understand why anything in prpy would be specifically tied to anything other than the OpenRAVE world frame, which can be related to the robot base frame in any way that we want. We have consciously chosen to relate HERB -> world via his odometry/the stargazer, but you could also teleport the lab kinbodies around and keep HERB fixed to the world if you wanted.

@jeking04
Contributor

First, I agree that we should fix the underlying oscillation.

Second, the reference_link parameter change has been made to VNCC, SimTrack, and Chisel. The VNCC/Chisel changes are working their way through via PR; the SimTrack change will be in a separate PR as soon as the Chisel one completes.

@psigen I'm not sure if this answers your question, but I'll give it a shot. prpy.perception is where we take data from perception modules and turn it into OpenRAVE kinbody objects, so prpy.perception needs to know where to put these bodies. We wanted the ability to do exactly what @DavidB-CMU suggested: detect in the robot frame rather than the world frame. This allows us to hack around the localization error by running herbpy with the base simulated and still detect objects and get them into OpenRAVE in the right place relative to HERB.
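The reference_link mechanism described above can be sketched roughly as follows. Only the parameter name `reference_link` comes from this thread; the function, the stand-in link class, and the 2-D transform math are assumptions for illustration, not prpy's actual implementation:

```python
def mat_mul(A, B):
    """Multiply two 3x3 homogeneous transforms."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

class FakeLink:
    """Stand-in for a link object that reports its world transform,
    mimicking the GetTransform() accessor OpenRAVE links provide."""
    def __init__(self, T):
        self._T = T
    def GetTransform(self):
        return self._T

def place_detection(T_detection, reference_link=None):
    """Return the world-frame pose at which to add the detected kinbody.

    With no reference_link, the detection is taken to already be a
    world-frame pose. With one, the detection is interpreted relative
    to that link (e.g. the robot base), so error in the map->robot
    transform never touches the object's pose relative to the robot.
    """
    if reference_link is None:
        return T_detection
    return mat_mul(reference_link.GetTransform(), T_detection)
```

With the base simulated, the reference link's transform stays fixed, which is exactly why detections still land in the right place relative to HERB.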
