Hidden Markov Models for Hand Gestures

I am completing a final-year project on hand gesture recognition using Hidden Markov Models.

I have a fair understanding of Hidden Markov Models and how they work from simple examples such as the Unfair Casino and some weather examples.

I am looking to implement multiple Hidden Markov Models, where each model corresponds to a single gesture, similar to this paper, in which the observations are the angles between the coordinates of different points. This produces a sequence of numbers from 0 to 18, as seen in Figure 3 and Figure 4.

What would the hidden states be in terms of this scenario?

The weather example has the observations ‘Walk’, ‘Shop’ and ‘Clean’, which would correspond to the numbers 0–18 in the hand gesture case; however, I do not know what the hidden states ‘Rainy’ and ‘Sunny’ would correspond to in the hand gesture scenario.
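To make the one-model-per-gesture idea concrete, here is roughly how I am planning to structure the code: a minimal sketch using the hmmlearn library, where the gesture names, the number of hidden states, and the tiny training sequences are placeholders rather than anything taken from the paper.

```python
# Minimal sketch of the "one HMM per gesture" setup using the hmmlearn library.
# CategoricalHMM handles discrete symbols (older hmmlearn releases call this
# MultinomialHMM). Gesture names, the number of hidden states, and the tiny
# example sequences below are placeholders, not taken from the paper.
import numpy as np
from hmmlearn import hmm

N_SYMBOLS = 19  # observation symbols 0..18 (the quantised angles)
N_STATES = 3    # hidden states: abstract "phases" of a gesture, learned by Baum-Welch

def train_gesture_model(sequences, n_states=N_STATES):
    """Train one HMM on a list of integer observation sequences for a single gesture."""
    X = np.concatenate(sequences).reshape(-1, 1)  # hmmlearn wants one stacked column
    lengths = [len(s) for s in sequences]         # ...plus the length of each sequence
    model = hmm.CategoricalHMM(n_components=n_states, n_iter=100, random_state=0)
    model.fit(X, lengths)
    return model

# Placeholder training data; real data should cover the full 0..N_SYMBOLS-1 range.
training_data = {
    "wave":   [[3, 4, 5, 6, 5, 4, 3], [2, 4, 6, 6, 4, 2]],
    "circle": [[0, 2, 5, 9, 13, 16, 18], [1, 3, 6, 10, 14, 17]],
}
models = {name: train_gesture_model(seqs) for name, seqs in training_data.items()}

def classify(sequence):
    """Return the gesture whose HMM gives the sequence the highest log-likelihood."""
    X = np.asarray(sequence).reshape(-1, 1)
    scores = {name: model.score(X) for name, model in models.items()}
    return max(scores, key=scores.get)

print(classify([3, 4, 5, 6, 5, 4, 3]))  # expected: "wave"
```

My understanding is that the hidden states in this setup would not have a human-readable label like ‘Rainy’ or ‘Sunny’; they would just be abstract sub-phases of the hand’s movement, with their meaning coming out of the learned transition and emission matrices. I would like to confirm whether that is the right way to think about it.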

Can a polymorphed creature use class features that don’t require speech or hand gestures?

The Polymorph spell states (in part):

The target’s game statistics, including mental ability scores, are replaced by the statistics of the chosen beast. It retains its alignment and personality.

The creature is limited in the actions it can perform by the nature of its new form, and it can’t speak, cast spells, or take any other action that requires hands or speech.

So, can a polymorphed Paladin use their Divine Smite class feature? It expends a spell slot, but it is not a spell, and as far as we can tell does not require hands or speech.

We know (based on the 10th level Transmutation Wizard feature) that a wizard is supposed to be able to polymorph herself, and we can assume that she is also supposed to be able to maintain concentration needed to stay in the new form (otherwise it would be a pretty useless ability). This is (as far as I can tell) not something any of the beasts eligible as polymorph targets can normally do. So it is clear that the polymorphed creature maintains some abilities from her old form. The question is which ones.

The Druid’s Wild Shape ability explicitly says you can use class features, while Polymorph does not, suggesting that you can’t use class features while polymorphed. However, Wild Shape also explicitly says that you can continue to concentrate on a spell (see below), and Polymorph does not, yet we believe that a polymorphed creature can continue to concentrate on the polymorph spell. So the fact that Wild Shape explicitly allows something and Polymorph is not explicit about it does not necessarily mean that the polymorphed creature cannot do those things.

From Wild Shape rules:

You can’t cast spells, and your ability to speak or take any action that requires hands is limited to the capabilities of your beast form. Transforming doesn’t break your concentration on a spell you’ve already cast, however, or prevent you from taking actions that are part of a spell, such as call lightning, that you’ve already cast.

So, since Polymorph is silent on the question of whether or not class abilities can be used, and states only that the polymorphed creature cannot perform actions that require hands or speech, how can we best determine whether (RAI) polymorphed creatures maintain any of their old abilities beyond spell concentration?

Mouse Gestures Aren’t Working on Ubuntu 18.04

I have a ThinkPad T480 and I am running regular Ubuntu 18.04. Sometimes the trackpad gestures stop working, and I have to restart my computer to fix them. By trackpad gestures I mean two-finger right-click and scrolling. I also have tap-to-click disabled, but even when I enable it, it doesn’t work.
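Would something like the following be a sane way to reset the touchpad without a full reboot, or is there a better fix? It is only a rough sketch that toggles the device via xinput, assuming an X11 session; the name match is a guess, so the exact device name reported by `xinput list` may need to be substituted.

```python
# Rough sketch: toggle the touchpad off and back on via xinput instead of rebooting.
# Assumes an X11 session with xinput installed; the name match below is a guess,
# so check `xinput list` for the exact device name and adjust if needed.
import subprocess

def find_touchpad_name():
    """Return the name of the first xinput device that looks like a touchpad."""
    out = subprocess.check_output(["xinput", "list", "--name-only"], text=True)
    for name in out.splitlines():
        if "touchpad" in name.lower():
            return name
    return None

def cycle_touchpad():
    name = find_touchpad_name()
    if name is None:
        print("No touchpad-like device found via xinput")
        return
    # `xinput disable` / `xinput enable` toggle the device's "Device Enabled" property.
    subprocess.run(["xinput", "disable", name], check=True)
    subprocess.run(["xinput", "enable", name], check=True)
    print(f"Cycled device: {name}")

if __name__ == "__main__":
    cycle_touchpad()
```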

Thanks in advance for the help.

Dell XPS 13 9370 Touchpad Multi-touch / gestures 19.04

OK, so I’ve done some digging and found a lot of references to a Synaptics driver. I don’t think that is my issue, since all the clicks work just fine and the cursor movement is very smooth. I just want to enable three-finger swipes to switch between desktops, because I’m too lazy to use the keyboard shortcuts. I’m sure it’s doable, and I’ve read that it ‘works out of the box’ in 18.04, but I’d rather not go from 19.04 back to 18.04.

My machine is dual-booted with Windows, and everything in Ubuntu is working very smoothly.

xinput shows two ‘mice’:

⎜ ↳ DELL07E6:00 06CB:76AF Touchpad id=13

⎜ ↳ PS/2 Generic Mouse

I installed the xorg-synaptics driver solution I found in a separate thread, but there was no change. I poked around in the Tweaks tool and in Settings, and I’m just not seeing where to enable three-finger swipes.
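For what it’s worth, my understanding is that tools like libinput-gestures do essentially this: read libinput’s gesture events and map them to commands (often xdotool keystrokes). Is something along these lines the right idea? Below is a rough sketch of that approach; the parsing of `libinput debug-events` output is a best-effort guess, it needs read access to the input devices (sudo or membership in the `input` group), and the key names assume GNOME’s default workspace shortcuts.

```python
# Rough sketch of what gesture tools like libinput-gestures do: read libinput's
# gesture events and map three-finger swipes to workspace-switching keystrokes.
# Assumptions (not verified on this setup): X11 session, `libinput` and `xdotool`
# installed, read access to /dev/input (sudo or the `input` group), and GNOME's
# default Super+Page_Up / Super+Page_Down workspace shortcuts. The debug-events
# output format varies between versions, so the parsing is a best-effort guess.
import re
import subprocess

SWIPE_THRESHOLD = 15.0  # arbitrary horizontal distance before a swipe "counts"

def switch_workspace(direction):
    key = "super+Page_Down" if direction == "right" else "super+Page_Up"
    subprocess.run(["xdotool", "key", key], check=False)

def main():
    # stdbuf keeps libinput's output line-buffered when it is piped.
    proc = subprocess.Popen(
        ["stdbuf", "-oL", "libinput", "debug-events"],
        stdout=subprocess.PIPE, text=True,
    )
    in_three_finger_swipe = False
    dx_total = 0.0
    for line in proc.stdout:
        if "GESTURE_SWIPE_BEGIN" in line:
            in_three_finger_swipe = "3" in line.split()
            dx_total = 0.0
        elif "GESTURE_SWIPE_UPDATE" in line and in_three_finger_swipe:
            # Guess: the first float on the line is the timestamp, the second the x delta.
            floats = re.findall(r"-?\d+\.\d+", line)
            if len(floats) >= 2:
                dx_total += float(floats[1])
        elif "GESTURE_SWIPE_END" in line and in_three_finger_swipe:
            if dx_total > SWIPE_THRESHOLD:
                switch_workspace("right")
            elif dx_total < -SWIPE_THRESHOLD:
                switch_workspace("left")
            in_three_finger_swipe = False

if __name__ == "__main__":
    main()
```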

Swipe between Spaces/full-screen apps: Uniform gestures for Multi-Touch Trackpad and Magic Mouse

I often run macOS apps in the system-provided full-screen mode. I switch between using an Apple Magic Mouse and my MacBook’s Multi-Touch Trackpad.

To switch between Spaces/full-screen apps, System Preferences lets me use a three- or four-finger swipe on the Multi-Touch Trackpad, but only a two-finger swipe on the Magic Mouse.

I would like to have the same gesture available on both devices. I don’t want to think about which device I’m using; to me it should just be the same.

Is there a way to do this using BetterTouchTool? I haven’t been able to figure out the correct action.

Gestures in Android 9 and what they mean for Android app backward compatibility

I have heard about the gesture navigation feature in Android 9, and that it replaces the ordinary navigation in some way.

Is this feature optional in general on Android 9 smartphones? (i.e., is it off by default? Or is it on by default but can be switched off?)

Will apps built for Android 8 and lower simply not allow normal navigation on Android 9 phones with gestures enabled (assuming that, in general, the gestures can be disabled)?

What config file handles touchscreen gestures?

I have a Wacom Companion 2. Gestures like pinch-to-zoom and two-finger drag to pan do not work in Krita. I also have a Microsoft Surface Pro 3, which has a modded kernel and other files by Jakeday. Gestures to pan and zoom DO work with the Surface Pro 3 and Krita. So I’m curious whether touchscreen gestures are configured in a specific file that I could copy over to my Wacom Companion system to get gestures working properly, or am I barking up the wrong tree?

Thanks

How to configure stock Ubuntu Wayland gestures (18.10)

I’m running Ubuntu 18.10 on my HP ZBook Studio G3. When running the default DE on Wayland (by clicking the Ubuntu Wayland option on the login screen), I get the following touchpad gestures by default:

- two-finger scrolling
- four-finger swiping between desktops
- three-finger pinching to open ‘Activities’

However, only the two-finger scrolling gesture can be configured in the touchpad settings. Does anyone know where or how to configure the other two gestures?

Can a mage in Body of Air cast a spell that doesn’t require gestures?

A mage PC in the play-by-post Dungeon Fantasy campaign I’m playing in cast Body of Air on herself, in order to rise above a brewing combat. She then tried to cast Stone Missile, but the GM commented that she couldn’t do that because Body of Air gives the “No Fine Manipulators” trait while in effect (hands are made of, essentially, air currents). He said she wouldn’t have trouble throwing the missile, because that’s an “innate attack” (additionally, the missile of a missile spell isn’t material according to multiple game designer clarifications I’ve read), but that she couldn’t cast the spell.

By standard rules, with skill 15 or higher, a PC caster should need to speak one or two soft words, or make a quick gesture in order to cast a spell (except for spells that specify some special movement or incantation requirement), and it’s been my understanding that the PC in question has base skill 15 or higher (not my PC, so I’m not certain, but the OOC discussion reinforces this belief).

Is there a rule I’m missing that would prevent a PC under the effect of Body of Air from casting a spell that needs only a couple words?

How to imitate gestures in developer mode?

I get the following warning when I turn on USB debugging: “Developers can use special software to imitate gestures, change settings, and grant permissions via USB.”

Can someone tell me how exactly I can imitate gestures using some special software on my PC, and, if possible, how I can trigger a touch input on a part of the Android device whenever a volume button or the fingerprint scanner is touched?
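As far as I can tell, the “special software” is usually just a program on the PC driving the phone over adb, where Android’s `input` shell command can inject taps and swipes. Something like this minimal Python sketch seems to be the basic idea (coordinates and durations are arbitrary examples, not values from any particular device):

```python
# Minimal sketch: imitate touch gestures from a PC over adb.
# Assumptions: adb is on PATH, USB debugging is enabled, and the device is
# authorised. Coordinates and durations below are arbitrary examples.
import subprocess

def adb(*args):
    """Run an adb command and raise if it fails."""
    subprocess.run(["adb", *args], check=True)

def tap(x, y):
    adb("shell", "input", "tap", str(x), str(y))

def swipe(x1, y1, x2, y2, duration_ms=300):
    adb("shell", "input", "swipe", str(x1), str(y1), str(x2), str(y2), str(duration_ms))

if __name__ == "__main__":
    tap(540, 960)                    # tap roughly the centre of a 1080x1920 screen
    swipe(540, 1600, 540, 400, 250)  # swipe up, e.g. to scroll a list
```

What I still can’t work out is the second part: how to trigger a touch on a specific part of the screen whenever a volume button or the fingerprint scanner is pressed (perhaps by watching `adb shell getevent` output?).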