Discussion:
[Interest] Awkward touch behaviour on Windows with MouseArea and MultipointTouchArea
Nuno Santos
2018-11-13 16:21:19 UTC
Hi,

After spending many years developing touch applications for iOS and Android, I have finally acquired a touchscreen for Windows touch development.

I’m writing to this list because I’m facing an awkward issue with MouseArea and MultipointTouchArea.

Let’s suppose I have the following qml code:

MouseArea {
    anchors.fill: parent
    onPressedChanged: console.log(pressed)
}

MultiPointTouchArea {
    anchors.fill: parent
    onPressed: console.log("pressed")
    onReleased: console.log("released")
}
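For reference, here is the minimal self-contained file I am testing with. The mouseEnabled: false line is something I added while experimenting (it is supposed to make MultiPointTouchArea ignore synthesized mouse events and handle only genuine touch events), but the behaviour is the same with or without it:

import QtQuick 2.12
import QtQuick.Window 2.12

Window {
    width: 640; height: 480; visible: true

    MultiPointTouchArea {
        anchors.fill: parent
        mouseEnabled: false   // experiment: handle only real touch events
        onPressed: console.log("pressed")
        onReleased: console.log("released")
    }
}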


What is happening is that when I touch the screen with either of these pieces of code alone, I get the same result: the output only appears after the finger is released.

At first I decided to look through the Windows touch input settings and found the "press and hold to right-click" option, which was enabled. But even after disabling it, the issue remains.

I was expecting to see qml: true / qml: pressed when the finger touches the screen, and qml: false / qml: released when the finger is lifted from the screen, but that is not happening.

What am I missing?

I’m on Windows 10 with the Qt 5.12 beta 3 MSVC2017 kit.

Thanks in advance!

Best regards,

Nuno
