Control UI's click(x, y) - touchGesture

General discussions about Automagic and automation in general

Moderator: Martin

IAmJAG
Posts: 11
Joined: 01 Oct 2018 05:56

Control UI's click(x, y) - touchGesture

Post by IAmJAG » 10 Oct 2018 00:35

Hi,

I need some help with click(x, y).
(Phone not rooted)

I have a game with multiple accounts, and I want to automate logging in to the other accounts. The problem is that the game is protected against the Control UI function. I thought I could use click(x, y), but it did not work. Then I discovered that touchGesture can be configured to act just like a click, and it worked.

Though I have a workaround, I still want to know if anyone has info on how to use click(x, y) to click any part of the screen that Control UI cannot read.

I also have a question regarding touchGesture. I need to scroll the screen so that a button ends up at an exact location. I'm using touchGesture for this, but the behavior is not what I expected: it seems to swipe the screen upward by a random amount, so the button I want to drag into position lands at a random spot on the screen.

Regards,
jAGHere

Desmanto
Posts: 2709
Joined: 21 Jul 2017 17:50

Re: Control UI's click(x, y) - touchGesture

Post by Desmanto » 11 Oct 2018 14:40

Unfortunately there is nothing you can do about it. Control UI only works as far as accessibility allows. Webframes (in browsers and webapps), games (in the game UI), and some secure-flagged windows can't be controlled reliably; they are sandboxed from external accessibility services. Some functions still get through, but because of dynamic zooming, scrolling and panning, most of the time only click(pattern) and its kind can detect text; no element IDs are available.

touchGesture() seems to be an implementation similar to the root command input swipe, except it works without root and is only available from Nougat 7.0+. I longed for this feature when I was still on Lollipop, but after upgrading to Oreo I almost never use it except for testing :D Since it is similar to input swipe, it can swipe even through a webframe/game. The problem is that, since it seems to be a separate function, it can only take x,y as input; it can't detect a component or UI element inside the webframe/game to use as an anchor. This is what causes the random swipes: we don't know our current position inside the frame.

Although it can't be done directly, you can still use it to swipe or simulate click(x,y): use getTextInActiveWindow() and search for the label you want to anchor on. If it can't be found, use touchGesture() to scroll up/down, or swipe left/right if the scrolling is horizontal. Loop this using while(). Once the label is found, use getBounds() to get the element bounds, calculate its center point, and pass that x,y to touchGesture() as a single point (no swipe).
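The loop above could be sketched roughly like this in Automagic script. The function names come from this thread, but their exact parameter lists are my assumptions (the "Login" label, the coordinates, and the touchGesture argument order are all placeholders), so check Automagic's script function reference before using it:

```
// Hedged sketch of the anchor-scrolling loop; signatures are assumptions.
found = false;
tries = 0;
while (!found and tries < 10)
{
  allText = getTextInActiveWindow();          // readable text of the window
  if (contains(allText, "Login"))             // "Login" is a placeholder anchor label
    found = true;
  else
  {
    // swipe upward to scroll further; assumed args: x1, y1, x2, y2, duration_ms
    touchGesture(540, 1600, 540, 400, 300);
    tries = tries + 1;
  }
}
if (found)
{
  b = getBounds("Login");   // assumed to return left, top, right, bottom
  x = (b[0] + b[2]) / 2;    // center of the element
  y = (b[1] + b[3]) / 2;
  touchGesture(x, y, x, y, 50);  // single point = tap, no swipe
}
```

The tries counter is just a safety cap so the flow doesn't swipe forever when the label never appears.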

It can be done, but it's not worth the time, as this only works in browsers and webapps (and probably isn't reliable even there). Games don't use labels but icons/graphics instead; with no label to work on, Control UI can't detect anything to use as an anchor.
To click through any game or browser, root is required: use Execute Root Command with input tap x y.
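For the rooted route, the command text of the Execute Root Command action would look something like this (the coordinates are just examples for a 1080x1920 screen):

```
# tap at x=540 y=960 (root required)
input tap 540 960

# swipe variant: x1 y1 x2 y2 duration_ms
input swipe 540 1600 540 400 300
```

Since input injects events at the system level, it works even inside game UIs and webframes where accessibility can't see anything.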
Index of Automagic useful thread List of my other useful posts (and others')
Xiaomi Redmi Note 5 (whyred), AOSP Extended v6.7 build 20200310 Official, Android Pie 9.0, Rooted.

IAmJAG
Posts: 11
Joined: 01 Oct 2018 05:56

Re: Control UI's click(x, y) - touchGesture

Post by IAmJAG » 12 Oct 2018 01:34

Seems like a lot of work.

Thank you for clarifying.
Desmanto wrote:Although it can't be done directly, you can still use it to swipe or simulate click(x,y): use getTextInActiveWindow() and search for the label you want to anchor on. If it can't be found, use touchGesture() to scroll up/down, or swipe left/right if the scrolling is horizontal. Loop this using while(). Once the label is found, use getBounds() to get the element bounds, calculate its center point, and pass that x,y to touchGesture() as a single point (no swipe).
This gives me an idea.

Since getTextInActiveWindow() and getBounds() also can't be used in games, instead of those functions I can test whether an image or a specific color is at an (x,y) position on the screen. Then, when the condition is true, use touchGesture() to click on the object I'm looking for.

The only drawbacks are the following.

1. takeScreenshot() is slow. I tried using the execute command screencap instead, but it did not work on my phone; the resulting file is not readable.
2. getPixelColor() - I have not tried it yet, but from my experience in a Windows environment this will also be slow. Testing a single pixel color on the screen is maybe 50% reliable, so you have to test multiple pixels.
An alternative is to search for a smaller image inside the screenshot from #1, but I think that is even slower, unless there is a method that does not use getPixelColor.

So the account switch is slow. I might as well do it manually. :)

I've seen an app that finds an image on the screen and returns the coordinates of its top-left position. It is also fast: it can find a 100 by 100 image on a 1080p screen in 40 to 150 ms. It uses Lua scripting. I dropped my subscription because it lacks much of the functionality AM/Tasker offers, especially settings and environment variables. All it does is find an image on the screen. :D

Desmanto
Posts: 2709
Joined: 21 Jul 2017 17:50

Re: Control UI's click(x, y) - touchGesture

Post by Desmanto » 12 Oct 2018 05:10

screencap seems to need root. To use getPixelColor(), you have to init the image file first: use the action Take Screenshot, then init that image and use getPixelColor() to analyze the color. Again, it is painful work. I haven't found a better method yet.
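Put together with the multi-pixel idea above, the workflow might look roughly like this in Automagic script. This is a hedged sketch: it assumes a Take Screenshot action ran first and stored the file path in a variable, and the init()/getPixelColor() signatures, the color format, and all coordinates are assumptions to verify against Automagic's function reference:

```
// Assumed: {screenshot_path} was set by a preceding Take Screenshot action.
img = init(screenshot_path);            // load the captured screenshot

// Check several nearby pixels instead of one, since a single
// pixel match is unreliable (see drawback #2 above).
target = "#3F51B5";                     // placeholder color; format is an assumption
c1 = getPixelColor(img, 540, 960);
c2 = getPixelColor(img, 560, 960);
c3 = getPixelColor(img, 540, 980);

if (c1 == target and c2 == target and c3 == target)
{
  touchGesture(540, 960, 540, 960, 50); // tap the matched spot (single point)
}
```

The screenshot step dominates the runtime, which is why the whole approach ends up slow regardless of how fast the pixel checks are.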

Probably you should just use estimation, something like a recorded touch: use gesture recording to record each tap and navigate accordingly.
