Although I am not sure, the Steam Deck currently seems to emulate a mouse with its touch screen instead of sending touch inputs (at least that is what I experienced in desktop mode and some games). I would like an option to switch between actual touch input and mouse emulation, since some games may be built with touch input support.
I guess SteamOS does not support touch input yet. I think it would be good if SteamOS had a touch input implementation that takes into account the area of the finger, multiple touches, and gestures, like Android and other mobile operating systems do.
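For what it's worth, the kernel already exposes the raw data such an implementation would need: the Linux multitouch protocol reports per-contact slots, tracking IDs, and contact area. Here's a toy sketch of that bookkeeping in plain Python (simplified (code, value) tuples instead of real evdev reads; the constants come from linux/input-event-codes.h):

```python
# Toy model of Linux multitouch (type B) slot bookkeeping.
# Events are simplified (code, value) pairs, not real evdev reads.
# Constants as defined in linux/input-event-codes.h.

ABS_MT_SLOT = 0x2f         # switch the "current contact" slot
ABS_MT_TRACKING_ID = 0x39  # -1 means the contact lifted
ABS_MT_POSITION_X = 0x35
ABS_MT_POSITION_Y = 0x36
ABS_MT_TOUCH_MAJOR = 0x30  # major axis of contact area (finger size)

def track_contacts(events):
    """Fold a stream of MT events into a {slot: contact} dict."""
    contacts, slot = {}, 0
    for code, value in events:
        if code == ABS_MT_SLOT:
            slot = value
        elif code == ABS_MT_TRACKING_ID:
            if value == -1:
                contacts.pop(slot, None)   # finger lifted
            else:
                contacts[slot] = {"id": value}
        elif slot in contacts:
            key = {ABS_MT_POSITION_X: "x",
                   ABS_MT_POSITION_Y: "y",
                   ABS_MT_TOUCH_MAJOR: "area"}.get(code)
            if key:
                contacts[slot][key] = value
    return contacts

# Two fingers down, then the first one lifts:
stream = [
    (ABS_MT_SLOT, 0), (ABS_MT_TRACKING_ID, 100),
    (ABS_MT_POSITION_X, 200), (ABS_MT_POSITION_Y, 300),
    (ABS_MT_TOUCH_MAJOR, 12),
    (ABS_MT_SLOT, 1), (ABS_MT_TRACKING_ID, 101),
    (ABS_MT_POSITION_X, 600), (ABS_MT_POSITION_Y, 310),
    (ABS_MT_SLOT, 0), (ABS_MT_TRACKING_ID, -1),
]
print(track_contacts(stream))  # {1: {'id': 101, 'x': 600, 'y': 310}}
```

libinput does this (plus gesture detection) for real; the point is only that finger area and multiple touches are already in the event stream, waiting for the stack above to use them.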
For example, Elite Dangerous has literally tons of commands that can be bound to keys; the Deck could display buttons for docking requests, undercarriage, and so on. The list goes on and on.
We could have community-built layouts for MMOs that use part of the screen, or the whole screen in mirror mode. This could elevate the Deck to a whole new dimension in gaming.
I just got my Steam Deck yesterday and tried using its touch screen to play osu!. In Game Mode I can't get it to work at all. In desktop mode it sort of works, but when I tap the screen, osu! registers the "click" first, at wherever the cursor was, and only then moves the cursor to where I actually tapped (the position I wanted to click). Since the click was already registered at the previous position, I completely miss the note.
Still can't figure out how to get it to work.
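If that description is right, the miss falls out of event ordering alone: an emulated mouse clicks at the stale cursor position before the move arrives, while a real touch carries its coordinates in the tap itself. A toy model in plain Python (invented event tuples, not any real API):

```python
# Toy model of why a tap misses under mouse emulation.
# Events are (kind, x, y); None coordinates mean "not carried".

def first_hit(events, target, cursor=(0, 0)):
    """Return True if any click/tap lands on `target`."""
    for kind, x, y in events:
        if kind == "move":
            cursor = (x, y)                 # cursor warps to the tap
        elif kind == "mouse_down":
            if cursor == target:            # clicks at stale cursor pos
                return True
        elif kind == "touch_down":
            if (x, y) == target:            # tap carries its own coords
                return True
    return False

note = (500, 400)  # where the osu! note is

# Mouse emulation, in the order described above:
# the click arrives first, then the cursor moves to the tap.
emulated = [("mouse_down", None, None), ("move", 500, 400)]

# Real touch input: one event, coordinates included.
touch = [("touch_down", 500, 400)]

print(first_hit(emulated, note))  # False -> the note is missed
print(first_hit(touch, note))     # True
```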
It also fails to work right with Okami HD, but that's probably not something Valve can fix.
Just give us some options in Steam Input; I'll even settle for a toggle to disable mouse mode.
It's an X11 limitation: all touch input emulates a mouse. The Wayland political situation is settling now that that particular corporation caved under peer pressure and its followers' objections have been refuted. These issues will be fixed, but it will take years, because the changes are low level.
https://blog.martin-graesslin.com/blog/tag/touch/
https://www.mail-archive.com/xorg@lists.x.org/msg06622.html
https://blog.martin-graesslin.com/blog/2017/02/how-input-works-touch-input/
For Okami HD, you can help speed up native touch support by testing Wine Wayland. Wine Wayland is essentially WINE without the Xwayland middle layer.
https://www.gamingonlinux.com/2021/06/collabora-continue-enhancing-wine-for-wayland-now-with-vulkan-and-multi-monitor-support/
https://www.winehq.org/pipermail/wine-devel/2021-June/188412.html
https://www.winehq.org/pipermail/wine-devel/2021-December/203035.html
https://gitlab.collabora.com/alf/wine/
Not sure why RetroArch failed unless they don't support Wayland, and in that case, why claim touch support?
Can you confirm native Windows touch apps work fine with Wine Wayland?
I should also mention Okami HD does NOT have touch support on PC; however, I don't see why it fails when it works fine with a mouse.
Does this resolve the drag delay on the touch screen?
I think you misunderstand the problem. X11 is a display stack, and touch events must emulate pointer events. X11 has been the de facto display stack on Linux for 20 years, and its original assumptions no longer apply. Windows has DWM and macOS has Quartz.
Steam grabs raw input from /dev/input/* and translates buttons and so on itself; as a side effect, the Steam Controller does not show up in `sudo libinput debug-events`. The Steam client processes touch events fine, but the X server mangles them once you register them to a window.
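For the curious, the records read from /dev/input/* are fixed-size struct input_event entries. A minimal decoding sketch in Python (stdlib only; assumes the 64-bit layout, and the device path in the comment is just an example):

```python
import struct

# struct input_event on 64-bit Linux:
#   struct timeval time;  __u16 type;  __u16 code;  __s32 value;
EVENT_FMT = "llHHi"            # 24 bytes on x86_64
EVENT_SIZE = struct.calcsize(EVENT_FMT)

EV_KEY = 0x01
BTN_SOUTH = 0x130              # the "A" button on a gamepad

def decode(buf):
    """Decode one raw input_event record into a dict."""
    sec, usec, etype, code, value = struct.unpack(EVENT_FMT, buf)
    return {"time": sec + usec / 1e6, "type": etype,
            "code": code, "value": value}

# Synthetic record: A-button press at t=1.5s.
raw = struct.pack(EVENT_FMT, 1, 500000, EV_KEY, BTN_SOUTH, 1)
print(decode(raw))  # {'time': 1.5, 'type': 1, 'code': 304, 'value': 1}

# In real use you would read EVENT_SIZE-byte chunks from an opened
# device node (root or input-group membership needed), e.g.:
# with open("/dev/input/event0", "rb") as f:
#     ev = decode(f.read(EVENT_SIZE))
```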
For an X11 application to be touch aware, it must bypass a good chunk of X's input handling and process input itself, on top of still having to register a pointer on the window.
https://github.com/libsdl-org/SDL/blob/b599205d0ca7d8bcbed47579caa1e0cebc7009a7/src/video/x11/SDL_x11xinput2.c#L335
https://github.com/citra-emu/citra/commit/4ee914c7a8c405775a4e2e3d48a398bd7aa5113e
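The SDL code linked above opts into touch by building an XInput2 event mask and handing it to XISelectEvents. A rough Python port of just the mask arithmetic (event numbers taken from X11/extensions/XI2.h; the actual XISelectEvents call is C-side and not shown running here):

```python
# XInput2 event numbers, from X11/extensions/XI2.h.
XI_TouchBegin, XI_TouchUpdate, XI_TouchEnd = 18, 19, 20

def xi_set_mask(mask, event):
    """Python port of the XISetMask() macro: one bit per event."""
    mask[event >> 3] |= 1 << (event & 7)

# The same three events SDL selects for touch-aware windows.
mask = bytearray(4)
for ev in (XI_TouchBegin, XI_TouchUpdate, XI_TouchEnd):
    xi_set_mask(mask, ev)

print(mask.hex())  # "00001c00": bits 18-20 set, all in byte 2

# A C client would now hand this buffer to
# XISelectEvents(display, window, &eventmask, 1).
```

The point is that none of this happens for free: every application (or toolkit) has to do this opt-in dance itself, which is why so few X11 programs are touch aware.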
Wayland solves the problem by not specifying any particular input model and letting other projects like libinput process input events.
I probably will not test Wine Wayland for a while, because I want to wait until it is packaged into wine-staging. I have tried building WINE and failed many times.
Yes and no. The Wayland transition is a prerequisite for resolving these random delays in the Linux display stack. The Steam Deck arrived when the whole ecosystem had barely become stable and usable; Zoom added PipeWire screen sharing less than a year ago. Under Wayland, each frame and event is timestamped, so you can hook up tools and measure them. Ideally, you poll input as close to drawing, near the vblank, as possible. These fixes are ecosystem-wide changes and will take years. I would not be surprised if these problems end up being fixed after the Steam Deck 2 is released.
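The timestamping point is concrete: once every input event and every presented frame carries a timestamp, latency is just subtraction. A toy sketch in plain Python with invented numbers:

```python
# Toy input-to-present latency measurement, in the style the
# Wayland model enables: both events and frames are timestamped.

def latencies(events_ms, frames_ms):
    """For each input event, latency to the first frame that
    could have shown its effect (frames presented at vblank)."""
    out = []
    for t in events_ms:
        shown = next(f for f in frames_ms if f >= t)
        out.append(shown - t)
    return out

frames = [0, 16, 32, 48, 64]   # ~60 Hz presentation timestamps
taps = [3, 17, 31]             # input event timestamps

print(latencies(taps, frames))  # [13, 15, 1]
```

Polling input right before the repaint pushes every tap toward that best case of about one millisecond rather than a whole frame, which is what repaint scheduling is about.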
https://ppaalanen.blogspot.com/2015/02/weston-repaint-scheduling.html
Valve could push for input latency tooling, the way they pushed gamescope to solve compositor-specific problems.
tl;dr: I hope I am correct, because I am not a dev in these areas. These changes will take a long time; you might think it is a simple feature, but a good solution requires a lot of infrastructure work. I learned most of this from random Wayland vs X11 flamewars... the X11 side is kinda lacking.
It's weird though, because I only feel the delay when dragging; there's no delay when tapping.