On the robustness of the headtrackr #29
Hi, glad you like headtrackr! headtrackr relies a lot on facial colors to track the face, so to get good results it's most important that the light is even and that there are no skin-colored objects in the background. I usually get the best results if I face a window or another light source, so that there are no shadows on my face. Regarding precision, have you had a look at clmtrackr, which I've also made? It has much more precise facial tracking than headtrackr, but might be slower on some systems.
Yep, I definitely did, but clmtrackr is not even near the tracking speed I need. My game relies on my players moving their heads rapidly right and left, and clmtrackr can't keep up. On the other hand, headtrackr keeps up to speed just fine, but it loses focus. Is there any possibility of mixing the 2 algorithms? I am aware that…
Yeah, it might help to stop and start headtrackr at regular intervals (via stop() and start()) if it loses focus often. I don't really know any other algorithms that are as fast as camshift but more precise, unfortunately.
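A minimal sketch of that periodic-restart idea, written as a "watchdog" that only restarts when the tracker has gone quiet instead of on a blind timer. Assumptions: stop() and start() are the headtrackr methods mentioned above, the "facetrackingEvent" event name follows the headtrackr README, and the staleness threshold is an arbitrary tuning value, not anything from the library.

```javascript
// Watchdog sketch: restart tracking only when no tracking event has arrived
// for longer than maxSilenceMs. The timestamps are plain milliseconds so the
// decision logic itself has no browser dependencies.
function makeWatchdog(maxSilenceMs, startTime) {
  let lastEvent = startTime;
  return {
    // Call this every time a tracking event arrives.
    onTrackingEvent(t) { lastEvent = t; },
    // True when the tracker has been silent for longer than maxSilenceMs.
    shouldRestart(t) { return t - lastEvent > maxSilenceMs; }
  };
}

// Browser wiring (only meaningful on a page with headtrackr running;
// "htracker" is assumed to be an initialized headtrackr.Tracker):
// const watchdog = makeWatchdog(2000, Date.now());
// document.addEventListener("facetrackingEvent",
//   () => watchdog.onTrackingEvent(Date.now()));
// setInterval(() => {
//   if (watchdog.shouldRestart(Date.now())) { htracker.stop(); htracker.start(); }
// }, 500);
```

Keying the restart off event silence rather than a fixed interval avoids interrupting the tracker while it is still locked on.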
I already do this, but I have no way of detecting automatically whether the tracker has lost focus. Maybe a simple algorithm could be built using as input the events already emitted… I am no scientist, nor too experienced to suggest something, but I am trying. I don't really think something other than camshift can be used, as you said…
Side note: clmtrackr seems to be using an old fork of numeric.js; the current numeric.js is now faster.
I actually thought about looking at the dimensions of the box in order to detect when it fails, but in my experiments this would restart the tracking too often when it shouldn't. If you run it in the background, though, the slowdown/lag on detection of the face might not be an issue, so it's certainly worth a try. Neon22: where did you find an old version of numeric.js in clmtrackr? I thought I was using version 1.2.6 already, but maybe I forgot to remove it somewhere.
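A sketch of the kind of box-dimension check discussed above. Assumptions: the width/height fields follow the shape of headtrackr's facetrackingEvent, and the jump-ratio threshold is a made-up tuning value; as the comment above notes, picking a threshold that doesn't restart too eagerly is the hard part.

```javascript
// Heuristic sketch: a sudden large change in tracking-box area between two
// consecutive events suggests camshift latched onto something in the background.
function looksLikeFailure(prev, curr, maxRatio = 1.8) {
  const prevArea = prev.width * prev.height;
  const currArea = curr.width * curr.height;
  // Ratio of the larger area to the smaller, so growth and shrinkage
  // are treated symmetrically.
  const ratio = Math.max(prevArea, currArea) / Math.min(prevArea, currArea);
  return ratio > maxRatio;
}
```

This could feed the stop()/start() restart logic instead of a timer, at the cost of false positives when the user leans quickly toward or away from the camera.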
actually I tyhinkits just the link to sloisel rather than spiroz https://github.com/spirozh/numeric. sorry for misdirect |
In general there should be some telltale signs that the tracking has failed. Some time ago I remember you mentioning somewhere that Hue/Saturation could be used for the tracking. Is this still the case?
The original camshift paper mentions using only hue and saturation to track the face, but in my experiments I found that this didn't work as well as just using RGB, so I ended up using only RGB information to track the face. Hue and saturation are not implemented.
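For reference, this is the standard RGB-to-hue/saturation conversion that the camshift paper's color model is built on (headtrackr itself tracks raw RGB, as the comment above says, so this is purely illustrative).

```javascript
// Standard RGB -> hue/saturation conversion (the V of HSV is omitted, since
// the camshift paper discards brightness). r, g, b are in [0, 255];
// hue is returned in degrees [0, 360), saturation in [0, 1].
function rgbToHueSat(r, g, b) {
  const rn = r / 255, gn = g / 255, bn = b / 255;
  const max = Math.max(rn, gn, bn), min = Math.min(rn, gn, bn);
  const delta = max - min;
  let h = 0;
  if (delta !== 0) {
    if (max === rn)      h = 60 * (((gn - bn) / delta) % 6);
    else if (max === gn) h = 60 * ((bn - rn) / delta + 2);
    else                 h = 60 * ((rn - gn) / delta + 4);
  }
  if (h < 0) h += 360;
  const s = max === 0 ? 0 : delta / max;
  return { h, s };
}
```

Dropping brightness is what makes the hue/saturation model attractive in theory (more robust to lighting changes), even though plain RGB worked better here in practice.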
Very well. If I get some time off I'll try to test what happens with the web workers. Until then, thanks a lot for the info, and for the lib of course.
Hi,
I am trying to use headtrackr to get ONLY the X position of the head.
I am using it in a small game I've built, which looks pretty nice so far.
At the moment I ask my user to present his full face so that headtrackr can lock onto his head, and then I ask him to tilt the screen a bit so that only the chin and up is visible (I want to hide the neck, since low-cut tops create color interference).
I would like to ask 2 things:
What is the best advice you think I should give my users in order to make my game as robust/accurate/fast as possible? What calibration recommendations do you suggest I present, and what are the "perfect" conditions for headtrackr to work at its best?
The goal is to make the head-tracking as robust as possible between different lighting environments.
I also need the head-position detection to be as predictable as humanly possible (sometimes the head tracker goes all nuts on me and starts swinging right and left, losing its center).
At the moment I only advise that during head tracking, the user should make sure both sides of his face are evenly and brightly illuminated.
As a second calibration step, I advise my user to tilt his laptop screen up to the point where the neck is not in the frame.
Second thing:
Of course, any advice on parameters I might pass when starting headtrackr is welcome. (Should I use the facetracking x position or the headtracking x position? Should I calculate angles, etc.?)
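One cheap way to tame the left/right swinging described above is to smooth the x position before feeding it to the game. A minimal sketch, assuming event.x follows the shape of headtrackr's facetrackingEvent; alpha = 0.3 is an arbitrary smoothing factor, not a headtrackr parameter.

```javascript
// Exponential moving average over the reported x position. Small alpha means
// heavier smoothing (less jitter, more lag); alpha = 1 passes values through.
function makeSmoother(alpha) {
  let smoothed = null;
  return function smooth(x) {
    // Pull the running estimate toward the new sample by a factor of alpha.
    smoothed = smoothed === null ? x : smoothed + alpha * (x - smoothed);
    return smoothed;
  };
}

// Browser wiring (only meaningful on a page with headtrackr running):
// const smooth = makeSmoother(0.3);
// document.addEventListener("facetrackingEvent", (event) => {
//   const x = smooth(event.x);  // use the smoothed x to steer the game
// });
```

The trade-off is latency: the heavier the smoothing, the more the game's response lags behind a rapid head movement, so alpha has to be tuned against the game's speed requirement.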
Thanks in advance, man, and thanks for the work.