Touchscreen doesn't work with emscripten build #446
Comments
Update: It appears to be a problem with Emscripten itself. Touch partially works when I use two fingers, but the behavior is inconsistent. I will look for a workaround.
Could this issue be re-opened? Or could you explain how you fixed it? I have observed a similar issue myself. On Firefox on Android, in my own project, I was only able to click GUI elements (like a button) by dragging my finger from off the button onto the button and releasing it there; then it would activate. If I just tapped the button, it would not work. Using two fingers as suggested here also seems to work: I touch the button with one finger, then touch another part of the screen with a second finger that I move around a little and lift off the screen, and then I release the first finger from the button and it works. However, I have been completely unable to interact with rGuiStyler on itch.io on my phone; the GUI does nothing, with one or two fingers, whether I start the touch on or off the buttons.

Further investigation: on my own build of the core_input_mouse raylib example, modified to also show mouse release events (by changing the size of the circle), the click event does happen. But if the touch is not moved from its initial location (i.e. just tapping the screen), the mouse cursor stays where it was previously and is not moved to the location of the touch, unless a second finger is added and moved around, in which case the mouse is moved to the position of the first touch (and it becomes a right click; possibly the left button remains down as well, but I can't check that with the example as written).

There is also an issue with the mouse position in raylib being offset upwards from the actual position by approximately the header height (the part above the canvas), but that may come from using the 5.5 example code with a 5.0 raylib build; it doesn't happen in my other project that uses its own HTML shell file. In any case, this may be an upstream issue with raylib and not specific to raygui.
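For reference, a minimal sketch of the kind of modified core_input_mouse test described above (an assumption about how the example was modified, not the commenter's exact code): the circle follows GetMousePosition() and grows while the left button is reported down, so in the web build a plain tap should both move the circle and briefly enlarge it if the browser's touch-to-mouse emulation is working.

```c
#include "raylib.h"

int main(void)
{
    InitWindow(800, 450, "touch/mouse event test");
    SetTargetFPS(60);

    Vector2 ballPosition = { 400.0f, 225.0f };
    float ballRadius = 40.0f;
    Color ballColor = DARKBLUE;

    while (!WindowShouldClose())
    {
        // On desktop the circle follows the mouse; in the web build a plain tap
        // should also move it here if Emscripten's touch-to-mouse emulation works
        ballPosition = GetMousePosition();

        if (IsMouseButtonPressed(MOUSE_BUTTON_LEFT)) ballColor = MAROON;   // press detected
        if (IsMouseButtonDown(MOUSE_BUTTON_LEFT)) ballRadius = 60.0f;      // grow while held
        if (IsMouseButtonReleased(MOUSE_BUTTON_LEFT))                      // release detected
        {
            ballRadius = 40.0f;
            ballColor = DARKBLUE;
        }

        BeginDrawing();
            ClearBackground(RAYWHITE);
            DrawCircleV(ballPosition, ballRadius, ballColor);
            DrawText(TextFormat("mouse: %.0f, %.0f", ballPosition.x, ballPosition.y), 10, 10, 20, DARKGRAY);
        EndDrawing();
    }

    CloseWindow();
    return 0;
}
```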
The problem is in Emscripten itself, because of the odd way it handles touch input, and it doesn't seem to be fixable in raygui alone. I believe this is the right tracker to report it to, but I'm not sure. Since I am not the only one encountering this problem, let me reopen it.

PS: The Emscripten build behaves rather oddly: one finger has no effect until I drag-click with two fingers. When I build it as a native exe, the touch screen works fine, as far as I remember.
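A small diagnostic that can help narrow this down is comparing what raylib reports for the emulated mouse against the raw touch points in the same frame (a minimal sketch; nothing here is raygui-specific, and it only uses the standard raylib input queries):

```c
#include "raylib.h"

int main(void)
{
    InitWindow(800, 450, "touch vs mouse diagnostic");
    SetTargetFPS(60);

    while (!WindowShouldClose())
    {
        int touchCount = GetTouchPointCount();
        Vector2 mouse = GetMousePosition();

        BeginDrawing();
            ClearBackground(RAYWHITE);

            // What the (possibly emulated) mouse reports
            DrawText(TextFormat("mouse: %.0f, %.0f  left down: %d",
                                mouse.x, mouse.y, IsMouseButtonDown(MOUSE_BUTTON_LEFT)), 10, 10, 20, DARKGRAY);
            DrawText(TextFormat("touch points: %d", touchCount), 10, 40, 20, DARKGRAY);

            // What the raw touch points report
            for (int i = 0; i < touchCount; i++)
            {
                Vector2 touch = GetTouchPosition(i);
                DrawCircleV(touch, 30.0f, Fade(LIME, 0.5f));
                DrawText(TextFormat("%d: %.0f, %.0f", i, touch.x, touch.y),
                         (int)touch.x, (int)touch.y - 40, 20, DARKGRAY);
            }
        EndDrawing();
    }

    CloseWindow();
    return 0;
}
```

If the touch circles appear where the finger is but the mouse position does not follow a plain tap, that points at the touch-to-mouse emulation layer rather than at raygui's hit-testing.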
I have built a simple Emscripten project using raygui and Zig, but I have run into a problem with the touch screen.
Initially, when I compiled it into a Windows executable, the touch controls worked with my Zenbook's touch screen, despite being a bit difficult to use; however, when I compiled it as an Emscripten WebAssembly project, my touch screen no longer had any effect on the application.
As far as I can tell, none of the raygui examples from the official raylib website work either. For that reason, I would like to know:
Is this an intended behavior?
If so, are there any workarounds I can use to make the touchscreen responsive to the UI components?
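One possible application-side workaround is a minimal sketch under assumptions: the button bounds, label, and click counter below are placeholders, and it assumes raylib's touch/gesture input still arrives in the web build even when the emulated mouse does not move on a plain tap. The idea is to accept a raw tap inside the widget bounds in addition to the normal raygui result:

```c
#define RAYGUI_IMPLEMENTATION
#include "raylib.h"
#include "raygui.h"

int main(void)
{
    InitWindow(800, 450, "raygui touch workaround sketch");
    SetTargetFPS(60);

    Rectangle buttonBounds = { 300, 200, 200, 50 };   // placeholder layout
    int clickCount = 0;                               // placeholder action

    while (!WindowShouldClose())
    {
        // Accept a raw tap inside the button, independent of the emulated mouse
        bool tappedButton = IsGestureDetected(GESTURE_TAP) &&
                            CheckCollisionPointRec(GetTouchPosition(0), buttonBounds);

        BeginDrawing();
            ClearBackground(RAYWHITE);

            // Normal raygui path (mouse, or touch emulation when it works) OR raw tap fallback
            if (GuiButton(buttonBounds, "Press me") || tappedButton) clickCount++;

            DrawText(TextFormat("clicks: %d", clickCount), 10, 10, 20, DARKGRAY);
        EndDrawing();
    }

    CloseWindow();
    return 0;
}
```

This keeps the normal mouse path working on desktop while letting a single tap activate the button on touch devices; the underlying Emscripten/raylib behavior would still need a proper upstream fix.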