A quick look at Touch Handling APIs in Game Engines

Since the iPhone arrived in 2007, touch input has been at the forefront of human interaction with machines. But game engines have existed for a long time, preceding the mobile boom, and they expose touch as a means of interacting with the game world in many different ways. Below, we are going to look at different touch APIs to see how they enable gaming on our phones. Tilt, camera, and other sensors are going to be ignored so we can focus on touch.

Browser - Gold standard of Touch APIs

The browser is not a "game engine", but games do run in browsers, and it has solid handling of touch input that is used across devices every day in different ways. Mostly, touches are handled through DOM elements and not directly. But let's look at how direct handling can be done.

The browser handles four touch events:

  • touch start: when a finger starts touching the browser content area
  • touch move: when a finger moves after touching the browser content area
  • touch end: when a finger is released from the browser content area, whether it moved or not
  • touch cancel: when the touch is interrupted, e.g. the browser tab changes or the finger moves onto the browser UI

Subscribing to these events in code requires passing the function that will handle them; the boolean at the end controls how the event propagates. In the browser it is also possible to specify the element that listens for the event - these features are not specific to touch, but they enable interesting functionality.

function startup() {
  var el = document.getElementById("canvas");
  el.addEventListener("touchstart", handleStart, false);
  el.addEventListener("touchend", handleEnd, false);
  el.addEventListener("touchcancel", handleCancel, false);
  el.addEventListener("touchmove", handleMove, false);
}

Each event passes an array of Touch objects, each with the following properties: identifier, clientX, clientY, pageX, pageY, screenX, screenY, target. More properties are available experimentally depending on the browser. The identifier property is a unique integer for each touch and stays the same across events for the duration of that finger's contact with the surface.
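To illustrate, the identifier property is what lets you keep per-finger state across events. This is a sketch of my own: the handler names match the registration snippet earlier, but the event objects here are simulated rather than coming from a real page.

```javascript
// Track ongoing touches by their identifier, the way a browser
// handler would. In a real page, the event objects come from the
// touchstart/touchmove/touchend listeners registered above.
const ongoingTouches = new Map();

function handleStart(evt) {
  // changedTouches only lists the fingers that triggered this event
  for (const t of evt.changedTouches) {
    ongoingTouches.set(t.identifier, { x: t.clientX, y: t.clientY });
  }
}

function handleMove(evt) {
  for (const t of evt.changedTouches) {
    const prev = ongoingTouches.get(t.identifier);
    if (prev) { prev.x = t.clientX; prev.y = t.clientY; }
  }
}

function handleEnd(evt) {
  for (const t of evt.changedTouches) {
    ongoingTouches.delete(t.identifier);
  }
}

// Simulated sequence: two fingers down, the first one moves, both lift
handleStart({ changedTouches: [
  { identifier: 0, clientX: 10, clientY: 20 },
  { identifier: 1, clientX: 50, clientY: 60 },
]});
handleMove({ changedTouches: [{ identifier: 0, clientX: 15, clientY: 25 }] });
console.log(ongoingTouches.size);     // 2 fingers currently down
console.log(ongoingTouches.get(0).x); // 15 after the move
handleEnd({ changedTouches: [{ identifier: 0 }, { identifier: 1 }] });
console.log(ongoingTouches.size);     // 0
```

The same pattern (a dictionary keyed by the touch identifier) reappears in one form or another in every engine below.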

Desktop browsers can disable these events even when a touch screen is available. A different API, Pointer Events, exists and is recommended instead; it combines mouse input, touch input, and pen input in the same event type.
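As a hedged sketch of the idea (the describePointer function is my own illustration, not a browser API; the pointerType values "mouse", "touch" and "pen" come from the Pointer Events spec):

```javascript
// With Pointer Events, a single listener covers mouse, touch and pen;
// the pointerType field tells you which device produced the event.
// In a page you would register it with:
//   el.addEventListener("pointerdown", handlePointerDown);
function describePointer(evt) {
  switch (evt.pointerType) {
    case "touch": return `finger ${evt.pointerId} down`;
    case "pen":   return `pen down, pressure ${evt.pressure}`;
    case "mouse": return "mouse button down";
    default:      return "unknown pointer";
  }
}

// Simulated events, as they would arrive from a pointerdown listener
console.log(describePointer({ pointerType: "touch", pointerId: 3 }));
// -> "finger 3 down"
console.log(describePointer({ pointerType: "mouse" }));
// -> "mouse button down"
```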

Unity - The most complete API

In Unity, you can check Input.touchCount to figure out how many touchpoints are available at that frame, and retrieve each one using Input.GetTouch(i).

Each touch will have the following properties: fingerId, phase, position, deltaTime, radius, pressure, rawPosition, deltaPosition, azimuthAngle, altitudeAngle, maximumPossiblePressure, radiusVariance, and type (whether the touch was of Direct, Indirect (or remote), or Stylus type).

The following are the possible touch phases: Began (A finger touched the screen), Moved (A finger moved on the screen), Stationary (A finger is touching the screen but hasn't moved), Ended (A finger was lifted from the screen. This is the final phase of a touch) or Canceled (The system cancelled tracking for the touch).

Because both the index i used in Input.GetTouch(i) and fingerId are integers, it is not obvious which one to pass to touch APIs that expect an integer. For example, EventSystem.current.IsPointerOverGameObject(int) actually requires a touch.fingerId.

GameMaker - The multiple mice approach!

GameMaker has two separate APIs for dealing with touch. One is the same API used for mouse input, but it works the way a PC would if it had multiple mice attached, which makes it different from what we have seen so far. You pass a device number that goes from 0 to n, where the number of available devices appears to be 5 - but I couldn't find the function that retrieves this number. Device 0 is the first finger that touched the screen (or the mouse on a computer), device 1 is the second, and so on. You can then use a function like device_mouse_check_button(device, button) to check whether that finger is pressed (by checking the left button) or the mouse is clicking.

The following functions are available: device_mouse_check_button, device_mouse_check_button_pressed, device_mouse_check_button_released, device_mouse_x, device_mouse_y, device_mouse_raw_x, device_mouse_raw_y, device_mouse_x_to_gui, device_mouse_y_to_gui, and device_mouse_dbclick_enable.

An additional, second API gives access to all the common touch gestures and makes them easy to use. These include tap, drag, flick, pinch, and rotate finger events.

Godot - A minimal approach

Godot has a very lean InputEvent for screen touch, with three properties: index (a number from 0 to n identifying a finger), position (the x and y point on the screen), and a boolean named pressed that is false when the finger is released.

At the same time, since it is an InputEvent, it can propagate through the SceneTree, similar to how the browser allows handling events on elements in the DOM. This makes the simple approach still very powerful.

Additionally, Godot provides TouchScreenButton for designing buttons meant to receive touches from multiple fingers.

Unreal - Traditional with a timestamp

Unreal TouchEvents are similar to the ones we have seen so far: each touch has a unique Handle per finger, happens at a TouchLocation (an x, y position), has a Type (similar to the previously seen phases), a float indicating Force, and a DeviceTimestamp so you can know precisely when the specific touch occurred.

A touch Type can be one of the following enum values: Began, Moved, Stationary, ForceChanged, FirstMove, Ended, or NumTypes (used when iterating over the other types).

Overall, it's a very standard API. Events are also available for On Input Touch Begin, On Input Touch End and others.

Ogre3D - Very similar to SDL2

Ogre3D provides a TouchFingerEvent where each finger has a unique integer fingerId, a type, and an x and y position; additionally, dx and dy are provided as the delta of that position.

Love 2D - Polling and events and Lua

Love2D allows you to either poll for touches on an engine update or use callback functions to handle touch events.

When polling, love.touch.getTouches() returns a table containing an id for each active touch. The position of a touch can then be retrieved with local x, y = love.touch.getPosition(id) for each id. Because Love2D uses Lua, polling is very fast.

The available events are love.touchpressed, love.touchmoved, and love.touchreleased. Each of them receives the parameters id, x, y, dx, dy, and pressure.

Ren'Py - Gestures

Using gestures, instead of handling the position where a touch happens, is a very different approach. The Ren'Py docs explain it neatly:

The gesture recognizer first classifies swipes into 8 compass directions, "n", "ne", "e", "se", "s", "sw", "w", "nw". North is considered to be towards the top of the screen. It then concatenates the swipes into a string using the "_" as a delimiter. For example, if the player swipes down and to the right, the string "s_e" will be produced.

This allows building gestures as strings. config.gestures is available as a dictionary where a string representing a gesture maps to a function (an action): define config.gestures = { "n_s_w_e_w_e" : "progress_screen" }.
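The recognizer described above can be sketched generically. This is my own illustration in JavaScript, not Ren'Py code: each swipe's delta is mapped to the nearest of the 8 compass directions, and the sequence is joined with "_".

```javascript
// Map a swipe delta (dx, dy) to the nearest of the 8 compass
// directions. Screen y grows downward, so "n" means dy < 0.
function swipeDirection(dx, dy) {
  const dirs = ["e", "se", "s", "sw", "w", "nw", "n", "ne"];
  const angle = Math.atan2(dy, dx);            // -PI..PI, 0 = east
  const octant = Math.round(angle / (Math.PI / 4)) & 7;
  return dirs[octant];
}

// Concatenate a sequence of swipes into a gesture string
function gestureString(swipes) {
  return swipes.map(([dx, dy]) => swipeDirection(dx, dy)).join("_");
}

// A swipe down followed by a swipe right produces "s_e",
// matching the Ren'Py example above
console.log(gestureString([[0, 100], [100, 0]])); // -> "s_e"
```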


When researching this topic I also looked into HaxeFlixel Actions, and while there is no obvious way of handling touch there other than through the mouse, it has a very interesting system for matching input to a function that executes its gameplay result, which seems similar to what is available in both Ren'Py and GameMaker's premade gestures. Still, gesture handling inside the engine seems to be a minority approach.

Looking at the touch APIs of the engines above, most seem determined to give you meaningful data that is as close as possible to the raw user input, and to let you handle and interpret it as you wish. In the early days of multi-touch APIs I believe I even saw one x and y position per "pixel" touched, which was very demanding to go through. A position per finger per frame is data that can be handled with much less effort, and from the documentation that is what current APIs make available. In some engines, though, the position is a normalized float between 0 and 1, and you are on your own to convert it into your world coordinates.
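For the normalized-position case, the conversion is just a scale. A quick sketch (my own; the resolution numbers are made up for illustration):

```javascript
// Convert a normalized touch position (0..1 on each axis) into
// pixel coordinates for a given screen resolution.
function normalizedToScreen(nx, ny, width, height) {
  return { x: nx * width, y: ny * height };
}

// A touch in the middle of a hypothetical 1920x1080 screen
const p = normalizedToScreen(0.5, 0.5, 1920, 1080);
console.log(p.x, p.y); // -> 960 540
```

Converting from screen pixels into your world coordinates is then a matter of applying your camera transform, which varies per engine.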

Godot works with the leanest set: just an ID per finger, a position, and enough information to tell when the finger goes down and when it leaves the screen. If you are looking for the minimum you need to support mobile devices when building your game, I believe it hits it.

Gitlab Runner with self-hosted Gitlab and Sonatype Nexus with SSL

A Sonatype Nexus instance and an internal self-signed Gitlab instance were the only resources available to the CentOS 7 server we were dealt. Recently a dev and I configured a Gitlab Runner in Docker for CI builds. This is our story. Dun dun.

Note: I am assuming that you don't want to use a proxy for some reason. If you can access external resources through some corporate proxy, you may configure that instead, and it will probably be easy and just work. We found that using Sonatype Nexus was faster than reaching outside through proxies, and on this particular machine we could only go through the proxy with our personal authentication keys, which was undesirable.

Configuration with Self Signed Certificates

We need the self-signed certificates to be installed on the machine. Since this Gitlab uses these certificates, connecting directly to it without them will result in SSL errors. The first thing to do is download them; if you don't know where they are published, you may need to ask someone for the URL, but they are almost surely available somewhere.

The first thing we will need is the package ca-certificates (which may be already installed).
sudo yum install ca-certificates

We need to activate certificate management
sudo update-ca-trust enable

Go to where the self-signed certificates are made available and download them. You will need both the emitter and root certificates. You can get the URL of the root certificate from another certificate by issuing the following command:
openssl x509 -text -inform DER -in justDownloadedCertificate.cer

Rename both downloaded certificates to ca-mydomain-root.crt and ca-mydomain-emitter.crt.

Copy these files to /etc/pki/ca-trust/source/anchors

Execute the following command to update the managed certificates
sudo update-ca-trust extract

Test the SSL negotiation with your Gitlab server
openssl s_client -connect gitlab.mydomain.com:443

At the end of the output, if everything is fine, you will see: Verify return code: 0 (ok)

Configuration of Docker on the Host with Nexus

Here we are going to use Docker to run the Gitlab Runner, so the first thing to do is install Docker. We specifically want the RedHat fork of Docker, because it makes it easy to use a Docker registry other than DockerHub. It is installed like this.
sudo yum install docker

Now that Docker is installed, you may create a docker user and group if you like. Notice that the Docker daemon needs root access to your computer. We will assume your self-hosted internal Sonatype Nexus is available at http://nexus.mydomain.com/nexus/ . Note we could also use https here, but we would need the certificate, as in the previous step. If your self-hosted Nexus uses the same root certificate, then using the https Nexus URL should work.

I am assuming you have everything available on Sonatype Nexus, so we are going to configure the CentOS server to pull Docker images from there.

First, you need to figure out your Nexus Docker Registry proxy port! This is configured under the Docker registry repository's HTTP or HTTPS connector. I didn't have access to read this information directly through the Sonatype Nexus interface, so it required a phone call. This is where the number 8123 below appears from, as if by magic.

Edit the /etc/sysconfig/docker text file with any editor (eg: sudo vi /etc/sysconfig/docker ) and add the registry lines at the end.
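For reference, the lines we added looked roughly like the following. The hostname and the 8123 port come from the example above; ADD_REGISTRY and INSECURE_REGISTRY are options specific to the RedHat fork of Docker, and the exact values depend on how your Nexus connector is set up, so treat this as a sketch:

```
# Pull images from the internal Nexus Docker registry instead of DockerHub
ADD_REGISTRY='--add-registry nexus.mydomain.com:8123'
# Needed because this registry is served over plain http
INSECURE_REGISTRY='--insecure-registry nexus.mydomain.com:8123'
```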

Restart Docker service
sudo systemctl restart docker

Everything should be good now; let's pull the Gitlab Runner image
docker pull gitlab/gitlab-runner

Assuming everything worked out, we can move on.

Configuring Gitlab Runner through Docker

Now that the machine can fetch packages and has SSL access to internal network websites, we can configure the Gitlab Runner to have access to all of this too. It's useful to read the documentation for Gitlab Runner and for Docker-based Gitlab Runner.

Let's create a folder to store configurations
mkdir /opt/gitlab-runner

Now, let's create a folder to store the certificates
mkdir /opt/gitlab-runner/certs

We need to convert the certificates to PEM with the following commands
openssl x509 -in ca-mydomain-emitter.crt -inform DER -out ca-mydomain-emitter.pem -outform PEM
openssl x509 -in ca-mydomain-root.crt -inform DER -out ca-mydomain-root.pem -outform PEM

Now we need to create a bundle of these certificates in order to connect to the Gitlab server with SSL.
cat ca-mydomain-root.pem ca-mydomain-emitter.pem > /opt/gitlab-runner/certs/ca-mydomain-bundle.pem

A small note here: if you are working with Java, it unfortunately doesn't use the system certificates and instead has its own store, so you will need to add these certificates there too, and possibly your Sonatype Nexus instance's certificate if it uses a different root certificate, to make sure Gradle works.

Almost all set. On Gitlab, go into the settings of the repository you wish to build and get its registration token. At the time of writing, it is typically under Settings, CI/CD, Runners, Specific Runners, with a value like xxxxYYY_ZZ4S.

Let's register the runner with the internal Gitlab Server.
docker run --rm -t -i -v \
/opt/gitlab-runner:/etc/gitlab-runner \
--name gitlab-runner gitlab/gitlab-runner register --non-interactive \
--url "https://gitlab.mydomain.com" \
--registration-token "xxxxYYY_ZZ4S" \
--description "my-docker-runner" \
--tls-ca-file "/etc/gitlab-runner/certs/ca-mydomain-bundle.pem" \
--run-untagged \
--locked="false" \
--executor "docker" \
--docker-image "docker:stable" \
--docker-privileged \
--docker-volumes /var/run/docker.sock:/var/run/docker.sock

You may also add a specific tag for your runner with --tag-list "mytag"; just make sure your repository's jobs actually use that tag, otherwise the runner may never pick them up.

Last step: let's start the Gitlab Runner with restart always. This ensures that whenever Docker is initialized, the runner starts too.
docker run -d --name gitlab-runner --restart always \
-v /opt/gitlab-runner:/etc/gitlab-runner \
-v /var/run/docker.sock:/var/run/docker.sock \
gitlab/gitlab-runner

It works!

It really does: just push your commits and watch things happen!

Tea for Two - Development during Adventure Jam

Hello, I would like to talk a bit on Tea for Two which is my entry for Adventure Jam 2019, and the ideas and inspirations for it.
This is the first adventure game I have written and designed, along with the usual management and coding. Adventure Jam is a two-week event; I only selected the tools and thought about the prompt beforehand, then used the first week to actually build the story and design the game, mostly on paper, and the second week to implement everything. One base idea was to explore a dark-days mood, but without the game taking itself too seriously.
I decided to use the opportunity to create a game that relates to Future Flashback. In previous jams I participated in, there was always a point when the team thought "we can grow this game later", but after the jam each person went back to their own thing. With this in mind, I decided to make a small story that is self-contained but can be expanded. Guilherme has built a whole universe and timeline for Future Flashback, and while that game focuses more on the characters and their stories, this leaves a lot to explore by picking specific events and particularities from that universe. I briefly talked with him about the idea I had to play around with Logan and build a part of his past, and he gave some comments, but ultimately said don't worry, do your thing - he was very focused on the level he was building for Future Flashback.

Since Logan is a detective, this would allow me to make a detective story, which I had been reading about in scriptwriting books. With this came the task of figuring out the crime, the events that led to it, and the minimal environments needed for it. This gave me the first sketch of what the story would be.

Once I had this in mind, I started piecing together how the game would work. I really wanted to pick up similarities with the Sierra interface, where you choose special mouse cursors to do things, but I wanted to do something different with the buttons. I really liked how the Wadjet Eye Blackwell games solve "look at" descriptions: Joey is always there to turn them into conversations instead of a person talking to themselves. I also liked a lot how the player in Firewatch talks things through, including the over-the-radio sound, but I didn't want to lock myself into explaining everything over a radio. I also liked Unavowed's proposal of having an inventory of character abilities, so I decided to juggle these ideas around.
I wanted the character to talk about the room environment and about clues, and to be able to figure out clues from the environment by talking. To reduce scope, I decided there would be no object inventory.

When I was a kid, I played many times with my sister a game that here in Brazil is called Scotland Yard, but elsewhere, in the US and UK, is called 221B Baker Street. Each case in the game presents elements you have to figure out, like the murder weapon, where the murder happened, and why, and you play by walking around a board, competing with the other players to solve the crime first.

Originally I wanted the player to be able to talk about things in the scene and about the clues at any time, but this proved challenging to write within the jam timespan. I mostly wanted the dialogs to be interesting and to do at least one of the following:
  • help you solve the crime,
  • give a piece of background information on a character,
  • give a piece of background information on the world, or
  • be fun.
Another idea in the original scope was that the player should be able to talk to any character about anything, not just specific police specialists, but I found this difficult to write without being boring. I need to pause here and tell you that writing in English, and specifically writing dialog in English, is very hard for me, so my writing speed is slow. I spent some days with a paper notebook writing the dialogue.
After I had written and typed everything, I showed it to Future Flashback's musician, Jordan, and we talked about the script and the dialogs. Many details and reviews came up during our conversations, and he helped make my jokes work in English too. The main characters, even though they are American, have families of foreign background, and I wanted to keep some of the ideas I originally had.

Once the basic dialog lines were figured out, I showed the initial build to some people. It had just bad placeholder graphics and no sound of any kind, and the reception was bad: people complained they didn't want to be clicking and reading text, and thought the game made no sense. I then added some game elements that to me gave the game a lot of rhythm, and I showed it to more people, only to get back the complaint "these game elements pull me out of the immersion; I like the texts, but the game elements don't feel rewarding at all". At that point I had no idea what to do, so I decided to stick with the vision and finish it as is.

Around this time, Jordan gave me his last review of the game dialogs, and when I asked Morgan Willcock, from the AGS Forums, for some input, he decided to give me his own review of the dialogs as well.

From there on, the pieces eventually fell into place. After reading the many encouraging messages from Sally Beaumont on the Adventure Jam Discord channel, I eventually gathered the strength to start messaging voice actors. Every actor I approached agreed to their role, the conversations with them were far easier than I could have imagined, and they started working on their lines quickly. Francisco warned me that cutting the audio files would take double the time I imagined, and he was right: it was the most time-consuming task. Meanwhile, Ricardo reached a good spot in the level he was making for Future Flashback, and when we talked he was focused on delivering the needed backgrounds; he also convinced his wife Melany to join us, and she quickly made the awesome cover the game ended up having. At this point I was also in need of music and ambient sounds, and after I asked Edwyn Tiong and Arishgokol, both joined in and quickly delivered their contributions. As each voice actor delivered their lines, I passed them to Edwyn to add the radio effect heard in the final version.

The end of the development is a blur of sleep deprivation and cursing at the computer. All I remember is that the game got done in the end, and I spent a weekend lying on the sofa, alternating between sleeping, watching TV and eating.

Completely unhealthy development, but it got done.

It's 2018 already!

So 2016 passed, and then 2017 and it's now 2018. Life just goes faster the older you get... Time for some updates.

update time!

So I decided to put A Glass of Lores on pause - starting out by building a full-blown RPG and its engine as my first project maybe wasn't very smart. The story keeps being refined slowly in my Google Docs... But here is a cool thing: I started making a new game! And I am NOT making an engine this time.
This game is Future Flashback, and I've been shouting a lot about it on social networks, so if you have never heard of it, please click through to the website; there is a lot of material there.
I usually post a single thing I made or some instructions here, but instead I will just flow through a few things...

book I've read: Driving Results through Social Networks by Robert J Thomas

It argues for the need to align the culture of an organisation with its business strategy, and for the importance of finding the influencers in the company's social network. It also presents the hypothesis that innovation usually comes from teams rather than a single person, so to favour innovation you should have more teams, and offers the idea that if people in the organisation have more connections, you get a bigger flow of ideas that can come to fruition - you need a multitude of disciplines to generate good, profitable innovation.
The network will also have people working against innovation, which is important to take note of. Another risk presented is when a single person gets too many connections but accepts all incoming demands, becoming a bottleneck in the network. One interesting passage attributes to managers the work of exception-handlers: the fewer exceptions and the more routines the organisation encounters, the fewer managers will be necessary.

Creepy writing

Some years ago, I read Writing Ethnographic Fieldnotes, which talks about writing fieldnotes from observed behaviours, experiences, and interactions with and between people. I found this a good hobby for when I am alone travelling or eating and only have my phone, so I take the opportunity to write about some interaction I see around me. The Google Docs file where I write these is called Everyday Scenes. Here is an excerpt:
The woman waits patiently. Sitting shrunken in her chair, she swipes through group conversations on her phone. The man she was waiting for appears and sits beside her. She doesn't realize he is there until he touches her neck. She breaks away from her phone. From their gestures, they appear to be late for the cinema, and they rush in the theater's direction. They looked very happy.

Coding things

Future Flashback is made in Adventure Game Studio, so a lot of my recent work has been AGS related. You can skim through my github profile or my brand new portfolio page. That page is made using Jekyll; I used a prebuilt theme and just customised some details, which is why I was able to create it in a weekend.

I want to write more here! Hopefully before 2019! 
