
Wrist attachment and standard controller mount points #876

Open
dantman opened this issue Jan 25, 2017 · 19 comments

@dantman

dantman commented Jan 25, 2017

No rush, but I think a VRTK_WristAttachment script or left and right wrist alias objects in the SDK Manager could be a useful feature. Watches are a useful interface that we may see more of in VR; they can tell in-game time, show level timers, or work as a menu interface. A few games have done this so far just with a watch on a hand model. But it would be good to have a point we can simply attach something to so it follows the spot your wrist should be in.

In general it would probably be nice to have a standard set of attachment transform points available for controllers, since they vary by controller. For example: a grab transform for the relative location your hands grab around, to lock a pistol grip or sword handle to; a barrel transform for the location a gun barrel would naturally point and where you'd expect a generic gun to shoot; a point transform for the direction a laser pointer feels natural with the controller (for Vive wands that's usually up from the end of the wand, while on Touch you normally point with an index finger).

However I think it's worth special-casing the wrist transform and giving it a dedicated script. It's more of a fixed attachment point than a point relative to grabbing. Also, having a script dedicated to the wrist leaves us room to support any changes to how the wrist location is tracked. We've already seen tracked gloves announced. So it's possible that in the future the wrist and controller could be part of separate tracked objects. The gloves shown off also seem to track how the wrist bends, so the location of the wrist relative to the controller is variable.

@thestonefox
Member

The SDK Manager is really for identifying core aspects of the VR setup.

A wrist attachment point is not part of a core setup. Also it's easy enough to do by just attaching something as a child to the controller script alias (the same way the tooltips are attached to the controller alias).

As for the standard set of attachments, this is done via the "Interactable Object" snap handles and cannot be standardised because each object would potentially have a different snap point. No two pistols of different makes are held in the same position.

@dantman
Author

dantman commented Jan 27, 2017

The SDK Manager might not be the best place; I am thinking of something higher level. Though parenting the wrist to the controller may not be the best place either.


I understand you can attach something wrist-relative with a transform. However, there are two reasons I think it would be a good VRTK feature.

Firstly, it would be nice for that transform to be standardized and easy to drop in, instead of every game with a watch/wristband/etc... having to re-discover or copy and paste that transform.

Secondly, as I mentioned the wrist as a per-game hardcoded controller-relative transform is not very future-proof. There are a number of future situations where that won't work correctly.

  • If the controller is a glove and the controller's origin is relative to the wrist, then the transform is [0,0,0] and a Wand/Touch relative transform will be incorrect.
  • If the controller is a glove and the controller's origin is relative to the hand, then the wrist will not be a fixed transform but vary based on the angle from the forearm sensor.
  • If any new controller (Valve's knuckles controller or a new accessory) includes a forearm sensor, then the wrist location definitely won't be a fixed transform.
  • If a glove + controller combination is used (ie: a left hand wearing a glove accessory holds on to a controller) then the wrists end up relative to a completely different tracked object than the left/right controllers.

We already know that glove accessories will be coming out soon. We can't write code to handle them yet. But we can at least prepare by making sure that when they come out handling the wrist is a simple case of "update VRTK and tweak an attribute or two on the wrist attachment script to suit your game" instead of "refactor your game so wrist locations aren't hardcoded, parented to the wrong object, etc...".


On the topic of standard attachment points, I wasn't referring to the attachment points on grabbable objects. I'm referring to vectors/transforms for standard spots on the controller; which may be useful for scripts or making game elements follow. Not all of the points are grab related. And they aren't the same across platforms.

I've drawn up some diagrams to try and illustrate what I am thinking of.

  • (yellow) is a grab/grip transform, ie: the center of the position on the controller your hand grabs around aligned with the axis that a straight cylindrical object would point if you were holding it.
  • (cyan) is the point transform/vector, ie: the point and direction from which pointers are emitted. This differs between platforms; on the Vive wands the pointer normally emits from the end of the wands and points along the same axis as you grip (yellow), on the Touch controllers the pointer normally emits aligned to the forward vector from where your index finger should be.
  • (red) is a forward vector, ie: the direction your fingers point when you stretch them out and if you were to hold a generic sci-fi gun you'd expect it to fire.
  • As you can see in the final photo the axis you grip with (yellow) is not perpendicular to the forward vector (red) of your hand, so getting one from the other is not a simple 90˚ rotation.

Looking at the OpenVR source, it appears as if some of these transforms are even standard coordinate system components available in SteamVR. However I only see VRTK making use of the tip coordinate system.
https://github.com/ValveSoftware/openvr/blob/master/headers/openvr.h#L2987-L2994
https://github.com/ValveSoftware/openvr/blob/master/headers/openvr.h#L2850-L2854
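
The composition those coordinate system components imply can be sketched in a few lines. The component offsets below are made-up illustrative values, not real driver data; the relevant part is the math, i.e. controller pose times component-local transform:

```python
# Sketch (not the OpenVR API) of composing a per-component local transform,
# as a driver might report for e.g. "tip" or "handgrip", with the
# controller's tracked pose to get a world-space mount transform.
# All component offsets here are illustrative, not real driver data.
import numpy as np

def trs(tx, ty, tz, rx_deg=0.0):
    """Build a 4x4 transform: rotation about X, then translation."""
    r = np.radians(rx_deg)
    m = np.identity(4)
    m[1, 1], m[1, 2] = np.cos(r), -np.sin(r)
    m[2, 1], m[2, 2] = np.sin(r), np.cos(r)
    m[:3, 3] = [tx, ty, tz]
    return m

# Hypothetical component-local transforms for one controller model.
COMPONENTS = {
    "tip":      trs(0.0, -0.03, -0.15, rx_deg=-40.0),
    "handgrip": trs(0.0, -0.01, -0.05),
}

def component_world_transform(controller_pose, component):
    # world = controller pose * component-local transform
    return controller_pose @ COMPONENTS[component]

controller_pose = trs(0.5, 1.2, -0.3)  # pretend tracked pose
tip_world = component_world_transform(controller_pose, "tip")
print(np.round(tip_world[:3, 3], 3))   # world-space tip position
```

Because the driver supplies the component-local transform per controller model, the same lookup works unchanged on a wand, a Touch, or a future controller.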

[Diagrams: vive-points, oculus-points, hand-points]

@bddckr
Collaborator

bddckr commented Jan 27, 2017

I think it would be awesome for VRTK to support all these "Grab attach points on the controller". Given that the Oculus SDK probably doesn't provide something like that by default we'd have to introduce our own abstract attach points.

@dantman
Author

dantman commented Jan 27, 2017

@bddckr Yeah, probably. On the positive side though, Oculus via the SDK only has one type of controller and doesn't support 3rd party controllers because every new tracked object has to be designed and approved by Oculus; so we can just hardcode the transforms in the VRTK/SDK/OculusVR code.

Even better, SteamVR may have already done some of the work. If OpenVR outputs those coordinate system components when you plug a Rift+Touch into it then all we have to do is double check the controller origin is the same in OculusVR and SteamVR, then copy and paste the transform values into the OculusVR code.

@cameronoltmann
Contributor

This strikes me as something that would be best in two parts. A generic part that belongs in the toolkit and lets users set up arbitrary mount points. And an example scene or two that make use of these to implement things like watches, swords, and wands.

I'd be wary of assigning any standard mount points for things like gun barrels and watches as part of the core toolkit. Even the most generic-seeming thing often turns out to be very app specific. Look at the forums for a lot of the shooting games and you'll see people can't agree on the optimal angle between the barrel and the handset, for example. That's why I think the best bet is to include examples rather than building them into core.

@dantman
Author

dantman commented Jan 29, 2017

@cameronoltmann Being able to set up arbitrary mount points to connect things to would be a nice extra. The key thing though is that the mount points you create shouldn't be limited to being relative to the origin of the controller; they should be relative to known locations on the controller (handgrip, tip, base, forward vector, button components, ...) which may differ by controller type.

I still think the wrist use case needs more special handling than just a controller relative mount point. Especially given the whole wrists on a separate tracked object issue. However, I can compromise on wrists using mount points for now if:

  • It's very easy for me to insert a handgrip relative mount point on a controller, do the same on another tracked object, and it only takes a simple line of code to make an object attached to one mount point switch to another.
  • It's possible to create scripted mount points, ie: mount points where the relative transform is not fixed but can be updated by a script. So that if something comes along, like forearm tracking, it's very easy to write a script that will simply drop in on a mount point to make it follow the new tracking data.

Then the mount points are flexible enough that a "Wrist" point can be attached to a mount point; and in the future if any new controller features come out it's easy to just attach that wrist to a mount point on another controller or make the mount point itself dynamic.

This level of flexibility will probably be useful for some other type of feature a game might want to implement; though I can't immediately think of an idea off the top of my head.

@cameronoltmann
Contributor

The example scenes would include things like tip/grip/wrist for different controllers. The core toolkit would just need the raw functionality to enable those scenes, which is to say, the ability to define mount points with arbitrary offsets from the controller tip. With that functionality, it's trivial to create mount points that match any position or move in any way you want. And since most of those ways will be application specific, I don't think they belong in the core toolkit, but in the examples.

I also think you're unnecessarily complicating the wrist issue. If the wrist is a separate tracked object, it's really got nothing to do with the main controller, and thus nothing to do with the core toolkit. Additionally, since there are no standards beyond headset plus two controllers at this point, there's probably not much point in trying to architect the toolkit for things that don't exist.

@dantman
Author

dantman commented Jan 30, 2017

@cameronoltmann tip/handgrip/base are not simple offsets, you can't implement them that way; they are different for each type of controller. If you create a transform for a 'tip' based on the location of the tip on a Vive wand, then the tip will not be at the tip when you use Oculus Touch or Valve's knuckles controllers. And OpenVR understands this concept, defining separate coordinate systems for controllers (see https://github.com/ValveSoftware/openvr/blob/master/headers/openvr.h#L2991).

@cameronoltmann
Contributor

They are simple offsets. They just happen to be different for each controller type. And as new controller types are released, there will continue to be new sets of offsets that will make sense to have. And if you have example scenes for the different controller types, people will have good reference material to use when adapting to a new controller.

The reason I don't think it makes sense to bake these points directly into the toolkit, but rather to have example scenes, can be summed up by this question:

Which of the "standard" offset vectors is appropriate on all controller types to line a bow up with?

On the Vive, this is generally along the shaft of the controller. On the Oculus Touch controllers, none of the standard vectors is appropriate. And new controllers will introduce new configurations we haven't thought of yet. So baking in a predefined set of points will only serve to artificially limit a medium that is still made of wide open space and room for exploration. It would be a mistake, in my opinion.

@dantman
Author

dantman commented Jan 30, 2017

@cameronoltmann Please, please, look at the OpenVR code I am linking to. tip/handgrip/base are not offsets you hardcode; in SteamVR they are offsets you ask OpenVR for the location of. When the controller is a Vive wand it'll give you one value, when the controller is a Touch it'll give you a different value, and when the knuckles controllers are released you'll get another value without needing to update your game. (On that topic, games should not need to be updated every time a new 3rd party controller with the standard control-set comes out; that is the antithesis of how OpenVR is supposed to work and results in piles of old games that will never be updated and won't work correctly with new controllers.)


On the topic of bows there are two answers to that:

  • A) If your game is designed with tools where all interaction is done with an "interaction sphere" at the tip, then the mount point you attach the bow and bow string to is relative to the tip. (From the looks of the use of "tip" in the SDK code, this is VRTK's current behaviour), otherwise
  • B.1) The mount point you attach the bow to will be relative to handgrip, so that the location you hold a bow with will be placed relative to where your hand is physically grabbing.
  • B.2) Instead of the handgrip where your full hand is, the mount point the bow string attaches to can be relative to the location of the trigger, then the bow string is held near your index finger which you normally grab a bowstring with.

And the answer to "What 'standard' offset vectors would VRTK let you create mount points relative to?" would be "Every one at least one supported SDK provides". If a transform is not available through SteamVR for a controller, it'll fall back to the origin or perhaps the closest known point. If the Oculus SDK doesn't provide an offset, we'll hardcode the one SteamVR outputs for the Touch controllers (though from the look of the OculusVR folder code, most of these points may already be provided by the SDK).

The offsets currently supplied by SteamVR are:

  • origin: Not entirely defined where this is
  • tip: An unambiguous tip, used for laser pointing
  • base: For controllers with a defined base to it
  • handgrip: A neutral ambidextrous hand pose, on a plane relative to the neutrally posed thumb and index finger.
  • status: OpenVR calls this a "1:1 aspect ratio status area, with canonical [0,1] uv mapping". I'm not entirely certain what it actually maps to, but I'm guessing that this might be where SteamVR displays things like the battery level indicator in the dashboard.
  • all controller components: All the controller components SteamVR knows about (trigger, grip buttons, touchpad/joystick, application button, system button, etc) also have a defined component from which you can ask OpenVR where their location is.
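
The fallback behaviour described above (fall back to the closest known point, then the origin, when a controller's driver doesn't provide a component) could be sketched as follows; the component names come from the SteamVR list above, but the fallback chains are invented:

```python
# Sketch of component resolution with fallbacks. Component names follow
# the SteamVR/OpenVR set discussed above; the fallback chains and the
# example controllers are hypothetical.

FALLBACKS = {
    "handgrip": ["base", "origin"],
    "tip":      ["origin"],
    "base":     ["origin"],
    "status":   ["origin"],
}

def resolve_component(available, wanted):
    """Return the first available component in the fallback chain."""
    for name in [wanted] + FALLBACKS.get(wanted, ["origin"]):
        if name in available:
            return name
    return "origin"

vive_wand = {"origin", "tip", "handgrip", "base", "status"}
minimal_controller = {"origin", "tip"}  # a driver exposing almost nothing
```

A mount point asking for `handgrip` would resolve to `handgrip` on the wand but degrade gracefully to `origin` on the minimal controller instead of failing.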

@cameronoltmann
Contributor

Yeah, I see the set of predefined points. I see those as sort of a lowest common denominator.

Can you define mount criteria (position and orientation) that will work for all controllers for bow orientation? Sure, it makes sense for the body of the bow to go in the middle of your hand. But which way does it face?

@dantman
Author

dantman commented Jan 30, 2017

@cameronoltmann The coordinate information given by OpenVR should be a matrix, which the Unity plugin should be turning into a transform. So each coordinate system/component position OpenVR exposes also comes with an orientation.

I can't examine the SteamVR's output right now, so exactly what those orientations are is just theory at the moment:

  • tip: Presumably since the tip is intended for laser pointing the transform will be oriented so that whatever axis OpenVR considers 'forward' is pointing in the direction a laser/ray should be cast.
  • base: Presumably since this is for controllers with some sort of base the orientation would be something aligned with the flat plane of the base, so you know how it naturally rests on objects.
  • status: Since this seems intended for a canvas to attach to, this is probably oriented the way a canvas should be oriented
  • handgrip: It's hard to say which axis the handgrip will be oriented to, however given the statement "plane relative to the neutrally posed thumb and index finger" the orientation is probably relative to the forward direction of your hand rather than the axis you grip on (look at my hand diagram to see the difference).

For a bow, since OpenVR states that handgrip is relative to the plane where the index and thumb are placed: One option would be for the game developer to create a transformed mount point that is offset from the index/thumb so it instead sits in the center of your hand and set the origin/snap on their bows in the center point of the handle and face them forward.

Alternatively, if you wanted to work with more cylindrical objects like sword grips, you could instead define a mount point relative to the handgrip but rotate it a bit so it roughly fits your grip axis. Then you could define the snap point on your objects so they are oriented relative to whatever semi-cylindrical axis you grab the object with.
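
That sword-grip idea amounts to rotating the handgrip orientation about one of its local axes. A minimal sketch, assuming the handgrip frame is identity and using a 60° tilt that is an illustrative guess rather than a measured value for any controller:

```python
# Sketch: derive a grip-axis mount orientation from the handgrip frame
# by rotating about local X. The 60-degree angle is illustrative only.
import numpy as np

def rot_x(deg):
    r = np.radians(deg)
    return np.array([[1, 0, 0],
                     [0, np.cos(r), -np.sin(r)],
                     [0, np.sin(r), np.cos(r)]])

handgrip_rot = np.identity(3)              # pretend handgrip orientation
grip_mount_rot = handgrip_rot @ rot_x(60)  # tilt forward onto grip axis

forward = np.array([0.0, 0.0, 1.0])        # hand "forward" in local space
grip_axis = grip_mount_rot @ forward       # direction the grip cylinder points
```

Because the rotation is applied to whatever frame the SDK reports for the handgrip, the same fixed tilt follows the hand across controller types.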

@cameronoltmann
Contributor

My point is that you'll need to offset the built-in points by a different amount on different controllers, regardless of what you use as a base. Hence the utility of having more than the built-in points as examples. Different controllers result in different grip poses, which result in different mounting practices being appropriate for some actions. There will be no one-size-fits-all solution.

@dantman
Author

dantman commented Jan 30, 2017

@cameronoltmann If you had to vary your offsets on different controllers that would defeat the point of OpenVR having these points in the first place.

The handgrip is defined by OpenVR as being a "neutral hand pose, ambidextrous, between the index and thumb". Each controller driver defines where this is on the controller. And then OpenVR tells us where this is.

So it doesn't matter what controller you use. Whether you use the Vive wands, Oculus Touch, Valve's knuckles controller, or a 3rd party controller that implements a SteamVR driver; the handgrip component will always tell you where the plane the index finger+thumb sits on and how it is oriented.

From there the transform from your index+thumb plane to the center of your hand is the same, it doesn't matter what controller you're holding.

Different examples for mounting strategies would be good, but those mount points you create are still the same transform whatever controller you use.

@dantman dantman changed the title Wrist attachment point Wrist attachment point and standard mount points Jan 30, 2017
@dantman dantman changed the title Wrist attachment point and standard mount points Wrist attachment and standard controller mount points Jan 30, 2017
@cameronoltmann
Contributor

What I'm saying is that's just not sufficient for some cases. It's a good start but it doesn't give you everything you need for all cases. There will be exceptions. And there will be SDKs that are missing even some of those basic pieces.

Anyway, thanks for the links to the OpenVR code. I learned some stuff.

@dantman
Author

dantman commented Jan 30, 2017

Well, if someone does have a special case that's impossible to do with a standard coordinate component and a fixed-transform mount point, the suggestion I made for scriptable mount points would allow them to override the mount point transform with some platform- or controller-specific tweaks.

As for SDKs, that's probably not an important issue. Oculus only supports one type of controller, which can be hardcoded. Though looking at the OculusVR code I'm surprised at how many similar standard mount points it already handles. OSVR's documentation goes from confusing to nonexistent, but from what I can gather it looks as if they may already have something similar to the handgrip position.

We could always have a way to supply per-SDK overrides for a mount point's transform (say, an override script you drop in on a mount point). In rare cases this would let you define a mount point whose transform is per-SDK rather than per-controller, taking SDK differences into account by allowing you to define multiple transforms relative to SDK-specific components.

In any case it's probably not a big problem, VRTK is already relying on SDKs providing mount points like these to find the tip, buttons, etc.

@bddckr
Collaborator

bddckr commented Jan 30, 2017

I can imagine having different mount points supplied per SDK and being able to offset those to customize some more.

There's an issue that is about changing the way the buttons and axes are mapped to from the various SDKs. The idea is to move the whole per-SDK logic into the SDKs because it already shows that the current way of abstracting to a shared set of buttons and axes has its problems. There is no "touched grip" on the Vive for example. See #842 for more info. Work has already started to dynamically discover installed and VRTK-supported SDKs via #867.

The same could be done for mount points. The customization point would be per-Object then.

@dantman
Author

dantman commented Feb 8, 2017

To summarize, so far what we've discussed would probably look like this:

  • GetControllerElementPath or something similar should be updated to support per-SDK components/coordinate systems
    • Each SDK should be updated to support all the available coordinate systems and if possible, components (for Steam VR which has dynamic components this means at minimum adding the handgrip, base, and status coordinate systems; ideally also adding some of the components available on different controllers)
  • Add ability to create mount points by adding a script to a game object that is a child of a controller alias
  • For each available SDK a mount point should be able to define a component/coordinate system it should be relative to and a transform relative to it
  • Consideration should be made to ensure that scripts on the same object are able to modify and reparent the mount point afterwards (to either make it dynamic or mount it somewhere else)
    • Don't set transform each Update, set it once when ready
    • As reparenting/initialization of controllers in some SDKs happens after a delay of some number of frames; provide an event for scripts to listen to to allow them to wait until it's safe to override the transform
    • Make sure script behaves correctly on dynamic enable/disable so a script may disable the mount point behaviour and reparent the object to another controller alias (eg: a 3rd party tracked object)
  • Add ability for object grab snap points to be relative to a mount point on a controller instead of the ControllerElements.AttachPoint (SteamVR: tip, OculusVR: origin?).
  • Likewise add ability for pointers to be relative to a mount point.
  • Non-grabbable objects (e.g. Watches) can be attached to mount points by parenting the object to the mount point.

Glossary

  • SDK component/coordinate system: A "coordinate system" provided by a SDK (like the tip, handgrip, base, and status matrices provided by Steam VR) or "component" with its own transform also provided by the SDK (trigger, trackpad/joystick, etc...)
  • mount point: A user-created game object under a controller alias that is made relative to a SDK component/coordinate system, allowing various objects and scripts to be positioned relative to it
  • snap point: The existing snap point VRTK has for grabbable objects (snap handle, origin/handle transform, snap type, etc...)

Problems

  • Steam VR's components are dynamic, so not only can components differ from SDK to SDK, Steam VR's available components can differ from attached controller to controller.
    • How do we support identifying and mounting to more than just the base coordinate systems?
    • Some of these components and/or coordinate systems may not be available from controller to controller within Steam VR.
      • Idea: Instead of setting per-SDK mount point relative transforms use a filter/rule based setup where you: Select a SDK, select a relative point, and define a transform. At startup VRTK goes through the list and uses the first one that is valid. i.e. You can define multiple component-relative transforms for the same SDK and the first component that is actually available is what gets used.
  • Mount points will be game objects with multiple possible transforms, what is the best way to allow the transforms to be edited without tedious tweaking of the vector and quaternion values?
  • To my knowledge, grabbable objects and other VRTK components aside from the SDK Manager currently do not need a handle to the alias objects while editing; technically this means the objects in [VRTK] don't even need to be in the scene while editing. We probably want to retain this restriction, because it allows VRTK to be put into a prefab inserted dynamically by a loader and also ensures that the Persist On Load/DontDestroyOnLoad option available on the SDK Manager works correctly across scenes. In that case, however, how will grabbable objects know what mount points are available so you can select one in the editor?
  • Right now we have 2 controller alias objects. But aside from a Left/Right/Both Allowed Grab Controller option and a left-handed/right-handed snap point (of which you only actually have to specify one to get an object that snaps to the same place on both controllers), grabbable objects do not care which controller is which. How do we handle the fact that, while normally you'd create the same set of mount points on both controllers, the mount points are on different alias objects and you technically have two sets of them?
    • The simplest answer to this would probably be both a left and a right mount point option on the grab attach script. Although that might not be the best editor interface: for "Right Snap Handle" and "Left Snap Handle" you only have to specify one and the other will inherit it, whereas for the mount points you would have to specify both.

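The filter/rule idea from the Problems list could look roughly like this. The SDK names, component names, and offsets are all hypothetical; the point is the first-valid-rule-wins selection:

```python
# Sketch of rule-based mount point resolution: a mount point is a list
# of (sdk, component, offset) rules; at startup the first rule whose SDK
# is active and whose component the current controller exposes wins.
# All names and offsets are hypothetical.

WRIST_RULES = [
    ("steamvr",  "handgrip", (0.0, -0.08, 0.10)),
    ("steamvr",  "base",     (0.0, -0.10, 0.12)),
    ("oculusvr", "base",     (0.0, -0.07, 0.09)),
    ("any",      "origin",   (0.0, 0.0, 0.0)),   # last-resort fallback
]

def pick_rule(rules, active_sdk, available_components):
    for sdk, component, offset in rules:
        if sdk in (active_sdk, "any") and component in available_components:
            return component, offset
    raise LookupError("no usable mount rule")
```

This keeps per-SDK differences declarative: adding support for a new controller is a new rule, not a code change in the game.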
User stories

Multiple mount points for object grab

The "Some Random Room Escape" game has replaced the normal controllers in their game with a set of hands they've implemented to take different hand poses in-game based on the context a custom script gives them (neutral, grabbing a large object, grabbing a handle, pinching a small object). They already have a custom script on grabbable objects that determines what hand pose to use on that object (i.e. A key tells their controller script to switch to the pinch pose when grabbed). They need all grabbed objects to snap to the correct location in the hands, without tedious tweaking of the snap transform.

To do this:

  • On each controller alias they create a "Hand" mount point. Their interactive hand models are parented to this mount point. On SteamVR this mount point is made relative to the "handgrip" coordinates. On OculusVR it is made relative to the "Base" component.
  • For the pinch pose:
    • They create a "Pinch" mount point. This mount point is given the same SDK relative settings as the "Hand" mount point; however the transform is instead changed so the position is in between the fingertips when the hand model is in the pinch pose and the rotation is set so forward points in the direction an object like a key would point when you pinch it.
    • In their pinchable objects such as the key, the left and right mount point attachments are set to the "Pinch" mount point.
  • Similar mount points are created for the other hand poses, with the snap handles for objects with grabbable handles placed in the center of the handle and each object assigned to the respective mount point.

Wrist attachment

"Yet Another Parkour Game" (YAPG) and "Cooking Countdown" (CC) both want to add a wrist watch to one of the player's hands. YAPG uses the watch as a stopwatch while CC uses it to count down how much time the player has left and lets them pause the game to a menu. Both games do other things with the controller model, and CC sometimes hides it without wanting to hide the watch.

To do this they:

  • Create a "Wrist" mount point.
  • On SteamVR they set the mount point relative to handgrip.
  • On OculusVR they set the mount point relative to Base.
  • For both SDKs they move the transform around until it sits somewhere around the center of the player's wrist when holding the controller.
  • The watch game object has an origin in the middle of the wrist and is parented to the left controller alias' "Wrist" mount point.

A new glove controller was recently released. YAPG's players are pestering them to support the gloves so they can put away one of the Vive wands and only use the other for interacting with the menu. YAPG has already added a tracked object alias for the gloves that is only enabled when the gloves are in use and implemented hand gestures for controlling the game's locomotion. But the stopwatch is still hovering around the controller instead of the player's hands.

To fix this in YAPG:

  • They add a "Wrist" mount point to the glove tracked object.
  • They already have a script on their glove alias objects which when initialized detects if glove controllers are paired and disables the glove object otherwise.
  • The script is modified so that when initialized and enabled it finds the "Wrist" mount point on the controller alias for the same hand and reparents any children onto the tracked glove's "Wrist" mount point.
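
The reparenting step above can be sketched with a minimal stand-in for a scene graph (all names hypothetical); in Unity this would be a `Transform` reparent, but the bookkeeping is the same:

```python
# Sketch: when gloves come online, every child of the controller's
# "Wrist" mount point moves to the glove's. Node is a minimal stand-in
# for a scene-graph object; all names are hypothetical.

class Node:
    def __init__(self, name):
        self.name = name
        self.children = []
        self.parent = None

    def add(self, child):
        if child.parent:
            child.parent.children.remove(child)
        child.parent = self
        self.children.append(child)

def reparent_children(src, dst):
    for child in list(src.children):  # copy: add() mutates src.children
        dst.add(child)

controller_wrist = Node("Controller/Wrist")
glove_wrist = Node("Glove/Wrist")
watch = Node("Watch")
controller_wrist.add(watch)

reparent_children(controller_wrist, glove_wrist)
```

Because the watch only ever hangs off a "Wrist" node, neither game script needs to know whether that node lives on a controller alias or a glove tracked object.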

Some time later the gloves are pretty popular and CC has started seeing a number of their players owning glove controllers. CC adds the tracked gloves as well, for hand presence around the controller and even an index finger collider that can touch menu items. However the watch controls their menu and they don't want it to automatically switch to the gloves right away.

  • They add a new "VRTK_GloveGrabbedController" script to the glove tracked objects. This script uses a bend threshold for the middle and ring fingers to detect when they seem to be wrapped around an object, plus a box collider inside the area the hand grabs that is set to collide with the controller object, and it emits an event when it thinks a controller has been grabbed by the hand.
  • When the event fires CC uses a custom script to disable the "Wrist" mount point on that controller and reparent it to the glove.

Pointer attachment

The "Office Meeting Simulation" game has replaced the normal controllers with a laser pointer for pointing at a screen. This pointer is already set up with its own hand-relative mount point that is not the normal controller tip and may point in a different direction.

To get UI/interaction pointers pointing the right way they:

  • Create a "Pointer" mount point.
  • The SDK relative origin is copied from the mount point they use for the laser pointer object.
  • The mount point transform is moved so it sits where the laser pointer emits light.
  • In the pointer script the Pointer Origin Transform is set to the "Pointer" mount point object.

@VRactive

You can do all of the above using the VRfree glove, which even offers independent and mobile hand and finger tracking for the GearVR and Daydream since it has its own 3D positioning and does not rely on stationary references. Check it out here: http://www.sensoryx.com/company/system_demos/
