Wrist attachment and standard controller mount points #876
The SDK Manager is really for identifying core aspects of the VR setup. A wrist attachment point is not part of a core setup. Also it's easy enough to do by just attaching something as a child to the controller script alias (the same way the tooltips are attached to the controller alias). As for the standard set of attachments, this is done via the "Interactable Object" snap handles and cannot be standardised because each object would potentially have a different snap point. No two pistols of different makes are held in the same position.
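For the parenting approach described here, a minimal Unity sketch (the class name and the offset values are illustrative guesses, not VRTK API):

```csharp
using UnityEngine;

// Hypothetical helper: parent a watch model to the controller script alias
// with a fixed controller-relative offset, the same way tooltips attach.
public class WristWatchAttach : MonoBehaviour
{
    public Transform controllerScriptAlias; // e.g. the VRTK script alias object
    public Vector3 localOffset = new Vector3(0f, -0.05f, -0.1f); // guessed wrist offset
    public Vector3 localEuler = new Vector3(90f, 0f, 0f);        // guessed wrist rotation

    private void Start()
    {
        // Re-parent under the alias so the object follows the tracked controller.
        transform.SetParent(controllerScriptAlias, false);
        transform.localPosition = localOffset;
        transform.localRotation = Quaternion.Euler(localEuler);
    }
}
```

Note that this is exactly the hardcoded, controller-relative approach whose future-proofing the rest of the thread goes on to debate.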
The SDK Manager might not be the best place; I am thinking of something higher level. Though parenting the wrist to the controller may not be the best approach either. I understand you can attach something wrist-relative with a transform. However, there are two common reasons I think it would be a good VRTK feature. Firstly, it would be nice for that transform to be standardized and easy to drop in, instead of every game with a watch/wristband/etc. having to re-discover or copy and paste that transform. Secondly, as I mentioned, the wrist as a per-game hardcoded controller-relative transform is not very future-proof. There are a number of future situations where that won't work correctly.
We already know that glove accessories will be coming out soon. We can't write code to handle them yet, but we can at least prepare by making sure that when they come out, handling the wrist is a simple case of "update VRTK and tweak an attribute or two on the wrist attachment script to suit your game" instead of "refactor your game so wrist locations aren't hardcoded, parented to the wrong object, etc.". On the topic of standard attachment points, I wasn't referring to the attachment points on grabbable objects. I'm referring to vectors/transforms for standard spots on the controller, which may be useful for scripts or for making game elements follow. Not all of the points are grab related, and they aren't the same across platforms. I've drawn up some diagrams to try and illustrate what I am thinking of.
Looking at the OpenVR source, it appears as if some of these transforms are even standard coordinate system components available in SteamVR. However I only see VRTK making use of the tip coordinate system.
I think it would be awesome for VRTK to support all these "Grab attach points on the controller". Given that the Oculus SDK probably doesn't provide something like that by default we'd have to introduce our own abstract attach points.
@bddckr Yeah, probably. On the positive side though, Oculus via the SDK only has one type of controller and doesn't support 3rd party controllers because every new tracked object has to be designed and approved by Oculus; so we can just hardcode the transforms in the VRTK/SDK/OculusVR code. Even better, SteamVR may have already done some of the work. If OpenVR outputs those coordinate system components when you plug a Rift+Touch into it then all we have to do is double check the controller origin is the same in OculusVR and SteamVR, then copy and paste the transform values into the OculusVR code.
This strikes me as something that would be best in two parts: a generic part that belongs in the toolkit and lets users set up arbitrary mount points, and an example scene or two that make use of these to implement things like watches, swords, and wands. I'd be wary of assigning any standard mount points for things like gun barrels and watches as part of the core toolkit. Even the most generic-seeming thing often turns out to be very app specific. Look at the forums for a lot of the shooting games and you'll see people can't agree on the optimal angle between the barrel and the handset, for example. That's why I think the best bet is to include examples rather than building them into core.
@cameronoltmann Being able to set up arbitrary mount points to connect things to would be a nice extra. The key thing though is that the mount points you create shouldn't be limited to being relative to the origin of the controller; they should be relative to known locations on the controller (handgrip, tip, base, forward vector, button components, ...) which may differ by controller type. I still think the wrist use case needs more special handling than just a controller-relative mount point, especially given the whole wrists-on-a-separate-tracked-object issue. However, I can compromise on wrists using mount points for now if:
Then the mount points are flexible enough that a "Wrist" point can be attached to a mount point; and in the future if any new controller features come out it's easy to just attach that wrist to a mount point on another controller or make the mount point itself dynamic. This level of flexibility will probably be useful for some other type of feature a game might want to implement; though I can't immediately think of an idea off the top of my head.
The example scenes would include things like tip/grip/wrist for different controllers. The core toolkit would just need the raw functionality to enable those scenes, which is to say, the ability to define mount points with arbitrary offsets from the controller tip. With that functionality, it's trivial to create mount points that match any position or move in any way you want. And since most of those ways will be application specific, I don't think they belong in the core toolkit, but in the examples. I also think you're unnecessarily complicating the wrist issue. If the wrist is a separate tracked object, it's really got nothing to do with the main controller, and thus nothing to do with the core toolkit. Additionally, since there are no standards beyond headset plus two controllers at this point, there's probably not much point in trying to architect the toolkit for things that don't exist.
@cameronoltmann tip/handgrip/base are not simple offsets, you can't implement them that way; they are different for each type of controller. If you create a transform for a 'tip' based on the location of the tip on a Vive wand, then the tip will not be at the tip when you use Oculus Touch or Valve's knuckles controllers. And OpenVR understands this concept, defining separate coordinate systems for controllers (see https://github.com/ValveSoftware/openvr/blob/master/headers/openvr.h#L2991).
They are simple offsets. They just happen to be different for each controller type. And as new controller types are released, there will continue to be new sets of offsets that will make sense to have. And if you have example scenes for the different controller types, people will have good reference material to use when adapting to a new controller. The reason I don't think it makes sense to bake these points directly into the toolkit, but rather to have example scenes, can be summed up by this question: Which of the "standard" offset vectors is appropriate on all controller types to line a bow up with? On the Vive, this is generally along the shaft of the controller. On the Oculus Touch controllers, none of the standard vectors is appropriate. And new controllers will introduce new configurations we haven't thought of yet. So baking in a predefined set of points will only serve to artificially limit a medium that is still made of wide open space and room for exploration. It would be a mistake, in my opinion.
@cameronoltmann Please, please, look at the OpenVR code I am linking to. tip/handgrip/base are not offsets you hardcode; in SteamVR they are offsets you ask OpenVR for the location of. When the controller is a Vive wand it'll give you one value, when the controller is a Touch it'll give you a different value, and when the knuckles controllers are released you'll get another value without needing to update your game. (On that topic, games should not need to be updated every time a new 3rd party controller with the standard control set comes out; that is the antithesis of how OpenVR is supposed to work and results in piles of old games that will never be updated not working correctly with new controllers.) On the topic of bows there are two answers to that:
And the answer to "What 'standard' offset vectors would VRTK let you create mount points relative to?" would be "every one that at least one supported SDK provides". If a transform is not available through SteamVR for a controller, it'll fall back to the origin or perhaps the closest known point. If the Oculus SDK doesn't provide an offset, we'll hardcode the one SteamVR outputs for the Touch controllers (though from the look of the OculusVR folder code, most of these points may already be provided by the SDK). The offsets currently supplied by SteamVR are the controller components named in openvr.h: base, tip, handgrip, status, and the legacy gdc2015 pose.
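In the SteamVR Unity plugin, the controller render model is built from these named components, each carrying an "attach" transform that marks the component's coordinate system (this matches how VRTK already locates the tip, but verify against the plugin version you use). A hedged sketch of looking one up, with a fallback to the controller origin:

```csharp
using UnityEngine;

// Sketch: locate a SteamVR render-model component transform at runtime.
// Assumes the SteamVR Unity plugin has spawned the controller render model
// as child objects named after the OpenVR components ("tip", "handgrip", ...),
// each with an "attach" child marking the component's coordinate system.
public static class ControllerComponents
{
    public static Transform FindAttachPoint(GameObject renderModel, string componentName)
    {
        // e.g. componentName = "handgrip" or "tip", as named in openvr.h.
        Transform attach = renderModel.transform.Find(componentName + "/attach");
        if (attach == null)
        {
            // Fall back to the controller origin when the active driver
            // does not expose this component.
            attach = renderModel.transform;
        }
        return attach;
    }
}
```

The fallback is the behaviour described above: unknown components degrade to the controller origin rather than breaking.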
Yeah, I see the set of predefined points. I see those as sort of a lowest common denominator. Can you define mount criteria (position and orientation) that will work for all controllers for bow orientation? Sure, it makes sense for the body of the bow to go in the middle of your hand. But which way does it face?
@cameronoltmann The coordinate information given by OpenVR should be a matrix, which the Unity plugin should be turning into a transform. So each coordinate system/component position OpenVR exposes also comes with an orientation. I can't examine SteamVR's output right now, so exactly what those orientations are is just theory at the moment:
For a bow, since OpenVR states that the handgrip is relative to the plane where the index and thumb are placed: one option would be for the game developer to create a transformed mount point that is offset from the index/thumb plane so it instead sits in the center of your hand, then set the origin/snap on their bows at the center point of the handle, facing forward. Alternatively, if you want to work with more cylindrical objects like sword grips, you could instead define a mount point relative to the handgrip but rotate it a bit so it roughly fits your grip axis. Then you could define the snap point on your objects so they are oriented relative to whatever semi-cylindrical axis you grab the object with.
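A minimal Unity sketch of that idea: a mount point defined relative to the handgrip component with a fixed local offset, where `palmOffset` and `gripAxisTilt` are illustrative values a game would tune, not values from any SDK:

```csharp
using UnityEngine;

// Sketch: a mount point defined relative to a known controller component
// (e.g. the OpenVR "handgrip"), not the controller origin. The offset from
// the index/thumb plane to the palm centre is a property of the hand, so the
// same values should hold whichever controller reports the handgrip pose.
public class ComponentRelativeMountPoint : MonoBehaviour
{
    public Transform handgrip;                                // supplied by the SDK bridge
    public Vector3 palmOffset = new Vector3(0f, -0.03f, 0f);  // illustrative guess
    public Vector3 gripAxisTilt = Vector3.zero;               // extra tilt for sword-like grips

    private void LateUpdate()
    {
        // Mount point = handgrip pose composed with a fixed local offset.
        transform.position = handgrip.TransformPoint(palmOffset);
        transform.rotation = handgrip.rotation * Quaternion.Euler(gripAxisTilt);
    }
}
```

Because the offset is composed onto whatever pose the driver reports, swapping Vive wands for Touch controllers changes `handgrip`, not the script's values.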
My point is that you'll need to offset the built-in points by a different amount on different controllers, regardless of what you use as a base. Hence the utility of having more than the built-in points as examples. Because different controllers result in different grip poses, which result in different mounting practices being appropriate for some actions, there will be no one-size-fits-all solution.
@cameronoltmann If you had to vary your offsets on different controllers, that would defeat the point of OpenVR having these points in the first place. The handgrip is defined by OpenVR as being a "neutral hand pose, ambidextrous, between the index and thumb". Each controller driver defines where this is on the controller, and then OpenVR tells us where it is. So it doesn't matter what controller you use. Whether you use the Vive wands, Oculus Touch, Valve's knuckles controller, or a 3rd party controller that implements a SteamVR driver, the handgrip component will always tell you where the index finger+thumb plane sits and how it is oriented. From there, the transform from your index+thumb plane to the center of your hand is the same; it doesn't matter what controller you're holding. Different examples for mounting strategies would be good, but the mount points you create are still the same transform whatever controller you use.
What I'm saying is that's just not sufficient for some cases. It's a good start but it doesn't give you everything you need for all cases. There will be exceptions. And there will be SDKs that are missing even some of those basic pieces. Anyway, thanks for the links to the OpenVR code. I learned some stuff.
Well, if someone does have a special case that's impossible to do with a standard coordinate component and a fixed-transform mount point, the suggestion I made for scriptable mount points would allow them to override the mount point transform with some platform- or controller-specific tweaks. As for SDKs, that's probably not an important issue. Oculus only supports one type of controller, which can be hardcoded; though looking at the OculusVR code I'm surprised at how many similar standard mount points it already handles. OSVR's documentation goes from confusing to nonexistent, but from what I can gather it looks as if they may already have something similar to the handgrip position. We could always have a way to supply per-SDK overrides for a mount point's transform (say, an override script you drop in on a mount point), allowing you in rare cases to define a mount point whose transform is not per-controller but per-SDK, taking SDK differences into account by defining multiple transforms relative to SDK-specific components. In any case it's probably not a big problem; VRTK is already relying on SDKs providing mount points like these to find the tip, buttons, etc.
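A rough shape for those scriptable, overridable mount points might look like the following Unity sketch (all class and field names are hypothetical, not existing VRTK API):

```csharp
using UnityEngine;

// Sketch of the "scriptable mount point" idea: a mount point normally follows
// a fixed offset from an SDK-supplied component, but an optional override
// script can adjust the result with platform- or controller-specific tweaks.
public abstract class MountPointOverride : MonoBehaviour
{
    // Adjust the default component-relative pose in place.
    public abstract void ApplyOverride(ref Vector3 position, ref Quaternion rotation);
}

public class MountPoint : MonoBehaviour
{
    public Transform baseComponent;           // tip/handgrip/etc. from the active SDK
    public Vector3 localOffset;
    public MountPointOverride overrideScript; // optional, dropped in per SDK or controller

    private void LateUpdate()
    {
        Vector3 pos = baseComponent.TransformPoint(localOffset);
        Quaternion rot = baseComponent.rotation;
        if (overrideScript != null)
        {
            overrideScript.ApplyOverride(ref pos, ref rot);
        }
        transform.SetPositionAndRotation(pos, rot);
    }
}
```

Keeping the override as a separate component means the common case stays a plain fixed offset, and only the rare special cases pay for extra logic.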
I can imagine having different mount points supplied per SDK and being able to offset those to customize further. There's an issue about changing the way the buttons and axes from the various SDKs are mapped. The idea is to move the whole per-SDK logic into the SDKs, because it's already apparent that the current way of abstracting to a shared set of buttons and axes has its problems; there is no "touched grip" on the Vive, for example. See #842 for more info. Work has already started to dynamically discover installed and VRTK-supported SDKs via #867. The same could be done for mount points. The customization point would then be per object.
To summarize, so far what we've discussed would probably look like this:
Glossary
Problems
User stories

Multiple mount points for object grab
The "Some Random Room Escape" game has replaced the normal controllers in their game with a set of hands they've implemented to take different hand poses in-game based on the context a custom script gives them (neutral, grabbing a large object, grabbing a handle, pinching a small object). They already have a custom script on grabbable objects that determines what hand pose to use on that object (e.g. a key tells their controller script to switch to the pinch pose when grabbed). They need all grabbed objects to snap to the correct location in the hands, without tedious tweaking of the snap transform. To do this:
Wrist attachment
"Yet Another Parkour Game" (YAPG) and "Cooking Countdown" (CC) both want to add a wristwatch to one of the player's hands. YAPG uses the watch as a stopwatch, while CC uses it to count down how much time the player has left and to let them pause the game to a menu. Both games do other things with the controller model, and CC sometimes hides it without wanting to hide the watch. To do this they:
A new glove controller was recently released. YAPG's players are pestering them to support the gloves so they can put away one of the Vive wands and only use the other for interacting with the menu. YAPG has already added a tracked object alias for the gloves that is only enabled when the gloves are in use and implemented hand gestures for controlling the game's locomotion. But the stopwatch is still hovering around the controller instead of the player's hands. To fix this in YAPG:
Sometime later, the gloves are pretty popular and CC has started seeing a selection of their players owning glove controllers. CC adds tracked objects for the gloves as well, for hand presence around the controller, and even an index finger collider that can touch menu items. However, the watch controls their menu and they don't want it to automatically switch to the gloves right away.
Pointer attachment
The "Office Meeting Simulation" game has replaced the normal controllers with a laser pointer for pointing at a screen. This pointer is already set up with its own hand-relative mount point that is not the normal controller tip and may point in a different direction. To get UI/interaction pointers pointing the right way they:
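Whatever the exact steps, the shape of a solution for the pointer story is clear: give the pointer an origin transform that lives on the prop's emitter rather than on the controller tip. A hedged Unity sketch (the class and field names are hypothetical; whether your VRTK version's pointer renderer accepts a custom origin transform directly should be checked against its API):

```csharp
using UnityEngine;

// Sketch: keep a dedicated "pointer origin" object glued to the laser prop's
// emitter mount point. That origin object can then be used by the pointer
// renderer so UI and interaction pointers cast from the emitter, not the
// controller tip.
public class EmitterPointerOrigin : MonoBehaviour
{
    public Transform emitterMountPoint; // mount point on the laser pointer prop

    private void LateUpdate()
    {
        // Follow the emitter exactly; the pointer renderer treats this
        // object's pose as the ray origin and direction.
        transform.SetPositionAndRotation(emitterMountPoint.position,
                                         emitterMountPoint.rotation);
    }
}
```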
You can do all of the above using the VRfree glove, which even offers independent and mobile hand and finger tracking for the GearVR and Daydream, since it has its own 3D positioning and does not rely on stationary references. Check it out here: http://www.sensoryx.com/company/system_demos/
No rush, but I think a VRTK_WristAttachment or a left and right wrist alias object in the SDK Manager could be a useful feature. Watches are a useful interface that we may see more of in VR; they can show in-game time or level timers, or work as a menu interface. A few games have done this so far just with a watch on a hand model, but it would be good to have a point we can simply attach something to so it follows the spot your wrist should be in.
In general it would probably be nice to have a standard set of attachment transform points available for controllers, since they vary by controller (e.g. a grab transform for the relative location your hands grab around, to lock a pistol grip or sword handle to; a barrel transform for the location a gun barrel would naturally point and where you'd expect a generic gun to shoot; a point transform for the direction a laser pointer would feel natural with the controller; for Vive wands that's usually upwards from the wand, while on Touch you normally point with an index finger).
However, I think it's worth special-casing the wrist transform and giving it a dedicated script. It's a fixed attachment point rather than a point relative to grabbing. Also, having a script dedicated to the wrist leaves us room to support any changes to how the wrist location is tracked. We've already seen tracked gloves that will be coming out, so it's possible that in the future the wrist and controller could be part of separate tracked objects. The gloves shown off also seem to track how the wrist bends, so the location of the wrist relative to the controller is variable.
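A sketch of what such a dedicated script could look like (the class name mirrors the proposal; nothing here exists in VRTK today, and the offset values are placeholders a game would tweak):

```csharp
using UnityEngine;

// Hypothetical VRTK_WristAttachment: follows a controller-relative wrist pose
// by default, but the tracked source can be redirected later, e.g. to a
// glove's tracked object alias, without the game hardcoding anything.
public class VRTK_WristAttachment : MonoBehaviour
{
    public Transform trackedSource;   // controller alias now, glove alias later
    public Vector3 wristOffset = new Vector3(0f, -0.04f, -0.12f); // tweak per game
    public Vector3 wristRotation = new Vector3(75f, 0f, 0f);      // tweak per game

    private void LateUpdate()
    {
        transform.position = trackedSource.TransformPoint(wristOffset);
        transform.rotation = trackedSource.rotation * Quaternion.Euler(wristRotation);
    }
}
```

If glove tracking later reports wrist bend, only this script needs updating; games attached to it would pick the change up by updating VRTK and tweaking an attribute or two, as described above.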