In another post about calibrating the HyperPixel’s touchscreen, it was said that calibration shouldn’t be necessary out of the box, but nothing further was mentioned about whether it’s even possible. The problem I’m having is that presses within a certain distance of the display’s edge register/snap straight to the edge. Is there anything I can do to fix this, or is my unit defective? I made a little GIF to demo exactly what I’m dealing with: http://imgur.com/r8UZo5T
Just to expand: I have the same issue on my own display; it makes pushing the “close window” button more than annoying. Could just be inaccuracy in the touch display itself, I suppose.
At first I considered that it might just be inaccuracy as well, but the fact that it happens primarily within 5-7mm of the display’s edge tells me there might be some weird OS resolution or overscan issue with the display/touchscreen driver. The accuracy is adequate for my usage when I stay within the central part of the panel, but as soon as I need to click on something within that 5-7mm border it has severe issues. It might be worth mentioning that I installed it on a fresh Raspbian Jessie image.
Does it do the same thing if you use a stylus? My Pi Foundation 7-inch touchscreen does similar things if I try to touch close to the edge with my nubby finger. No such issues if I use a stylus, though.
I’ve tried to use a stylus on my own display, but it actually causes more issues with touch positioning than my finger does, even in the centre of the display (granted, it’s a terrible stylus worth about 10 pence, which is most likely my issue).
Ok, it was worth a shot. Is it capacitive touch or resistive touch? I had a resistive touchscreen I really liked, with a nice fine-tipped nylon stylus that was very accurate. I don’t like capacitive touch as much, mainly because my fine-tipped stylus won’t work with it; I have to use a nubby soft-tipped one that is only marginally better than my finger.
The HyperPixel has a capacitive touchscreen and I don’t have a stylus for capacitive touchscreens on hand. I had some sponges that I otherwise use for cleaning soldering tips (damp sponges generally make for fairly decent emergency styluses), but threading a tightly folded strip of one just barely through the hole of an empty ballpoint pen didn’t improve my results along the display’s border. It was more accurate than my finger within the center area of the display but behaved just the same along the edges.
You can get fine-tipped styluses that work with capacitive touch. They aren’t cheap though; the ones I’ve seen that say they work aren’t, anyway. Some of the Apple ones are supposed to work with capacitive touch, but I haven’t gotten my hands on any to actually test them. It’s a trade-off, IMHO. The capacitive touchscreens have a nice hard Gorilla Glass-like cover; they don’t damage very easily and smudges are easy to clean off, but you need a special stylus. The resistive touchscreens have a soft gel-like texture and can be easily damaged (the one I had was that way, anyway), but you can use just about any stylus. I bought a Nintendo stylus, took the tip out, and put it in a Bic pen. Now that I think about it, it might have been the resistive screen I had the issue with at the edges?
I’ve used this same type of makeshift sponge stylus with capacitive touchscreens in the past without any issues. It might be difficult to see exactly what I’m describing in the GIF. To clarify: there’s a border of approximately 5-7mm around the display in which 9 out of 10 presses register right on the very edge of the display. I’m going to try other distros tomorrow and see if that changes anything. Will report back if I find a solution.
It’s a calibration issue, but not the sort of calibration you can change yourself.
We looked into it in quite some depth and found some low-level noise being induced on the bottom electrode. This over-saturates it, so that the built-in logic that the touch-controller uses assumes your finger is right at the bottom, rather than part way between two electrodes.
The software on the touch-controller itself can’t be changed or calibrated, but what we can do to fix this is read the raw electrode data and write our own multi-touch driver.
Unfortunately, that’s really hard. But it’s on our to-do list. Hopefully such a driver will also bring with it better multi-touch recognition and perhaps even some low-level gesture support. But first let’s walk before we start running.
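To illustrate the effect with some made-up numbers (this isn’t the controller’s actual algorithm, just the general idea): if position is estimated as a weighted average of electrode readings, a noise-saturated bottom electrode drags the estimate towards the bottom edge no matter where your finger is.

readings = [0, 0, 0, 40, 60, 0, 0, 1023]  # a touch near rows 3-4, plus a
                                          # noise-saturated bottom electrode
position = sum(i * r for i, r in enumerate(readings)) / float(sum(readings))
print(position)  # ~6.7 of 7: pulled almost all the way to the bottom row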
Fortunately, the touch screen is already leagues ahead of some of the unusably awful resistive ones out there :D But we believe we can make it… BETTER!
To see all the magic behind your touch-screen, try this (which should scroll all the X and Y electrode values with a little noise suppression applied):
#!/usr/bin/env python
import time
import sys
import smbus

ADDR = 0x5c           # I2C address of the touch controller
bus = smbus.SMBus(3)  # the HyperPixel's touch controller lives on bus 3


def read_byte(address, delay=0):
    return bus.read_byte_data(ADDR, address)


def smbus_read_touch():
    # Read all 44 bytes of raw electrode data in two chunks
    raw_adc = bus.read_i2c_block_data(ADDR, 0x00, 32) + bus.read_i2c_block_data(ADDR, 0x20, 12)
    adc_out = []
    for x in range(0, len(raw_adc), 2):
        # Each electrode reading is a big-endian 16-bit value
        val = (raw_adc[x] << 8) | raw_adc[x + 1]
        val -= 100              # a little noise suppression
        val = max(0, int(val))
        adc_out.append(val)
    touch_x = list(reversed(adc_out[8:]))  # X electrodes come back in reverse order
    touch_y = adc_out[:8]
    print("y: " + " ".join([str(x).rjust(4, ' ') for x in touch_y]) + " | x: " + " ".join([str(x).rjust(4, ' ') for x in touch_x]))


def smbus_config_touch():
    bus.write_byte_data(ADDR, 0x6e, 0b00001110)


if __name__ == "__main__":
    smbus_config_touch()
    while True:
        smbus_read_touch()
        time.sleep(0.001)
The display resolution, refresh rate, and affordability of the HyperPixel alone sealed the deal for me but the capacitive touchscreen is a very much appreciated addition on top.
Did you want me to echo the output to a file and share any specific results on here, or was it more just for me to see for myself?
Would you by any chance be able to give a rough ETA on the driver, assuming one will be pushed to production?
I don’t have a firm ETA yet - mostly because my current understanding of how it might work is theoretical. I was hoping Jon’s coding genius would crack it, but he’s a busy busy fellow. I’m probably going to have to have another crack at it!
I ran the Python code above on the Pi, but nothing explicitly stood out in a way that I could make sense of. I’ve used similar code for testing when first getting into reverse engineering as a hobby, only it monitored changes in mouse (the computer kind) position readings instead; little of that is likely to be applicable here beyond x and y, but I digress.
Thank you for taking time out of your day to offer insight into all of this! Interesting stuff. Calibration issue aside, the HyperPixel has been all-around excellent, trouncing my previous 3.5" TFT in every single category. Best of luck to you with sorting out the touch-controller :) For the record, right-click by means of multi-touch would be really damn useful. hint hint nudge nudge
Slightly off topic, but one of my biggest disappointments with the Pi Foundation touchscreen and Raspbian is the lack of multi-touch. It’s 10-point capacitive touch, but only supports single touch out of the box? Not even right-click functionality. So I can see why people would want this on whatever display they are using.
Using the Python script and then touching my shiny new HyperPixel, I can clearly see that it doesn’t detect anything within a certain region. Now that’s ugly. I only noticed right after unpacking because the blind spot is exactly where the infamous SSH password warning dialog pops up and has its OK button. I can touch and touch and touch, but there’s no response. Only when I move the window, so that the OK button is in some other place where touches get registered, can I close the dialog. When slowly swiping my finger while running the Python script, I can see that everywhere else on the screen touches get correctly registered, but not in that particular spot below the middle right axis. Towards the edges, touches get registered correctly again.
The touch screen is driven by a discrete grid of electrodes wired up to an ADC. As such it doesn’t have 800x480 actual points of sensitivity, but rather (IIRC) 13 x 7 lines.
The row/column intersection that your finger is closest to will read the highest value, the next closest will read the next highest, and so on. If your finger is exactly between two intersections then, in theory, they should both read the same value. So, to get the position of a finger, you find the adjacent intersections with the highest values and interpolate between their locations according to their values.
I.e.: if you have point A and point B reading 75 and 25 (totally arbitrary values you won’t see in reality), then you can approximate that the finger is 3 times closer to A than it is to B.
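In Python, that interpolation step might look roughly like this; a minimal sketch with illustrative numbers, not the shipped driver code:

STEP = 800.0 / (14 - 1)  # pixel pitch between columns, assuming 14 columns

def interpolate(readings, step):
    # Estimate a position along one axis of electrode readings
    peak = readings.index(max(readings))                 # nearest intersection
    left = readings[peak - 1] if peak > 0 else 0
    right = readings[peak + 1] if peak + 1 < len(readings) else 0
    neighbour = peak + 1 if right >= left else peak - 1  # stronger neighbour
    a, b = readings[peak], readings[neighbour]
    if b <= 0:
        return peak * step                               # lone peak, no shift
    # Weighted average (centroid) of the two intersection positions
    return (a * peak + b * neighbour) * step / (a + b)

# With readings of 75 and 25 the estimate lands a quarter of the way
# along the gap, i.e. three times closer to A than to B
print(interpolate([0, 75, 25, 0], STEP))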
There’s no electrical limit to the number of touches you can track simultaneously on this display, but there are practical ones - sooner or later it becomes impossible to tell the difference between fingers.
(Note: This doesn’t work well on HyperPixel because its native tracking of two fingers leaves a little to be desired)
As it happens, I saw your post late last night and thought I should link you to this code example to confirm. Looks like you’ve taken the initiative and beaten me to it, though.
It sounds like a bug, not a feature. I think you might need a replacement.
Yeah, I’m not 100% sure why there’s a total lack of support for gestures in Pixel, but then I haven’t used many Linux desktops, much less touchscreen-enabled ones, so I don’t know what the wider state of support is like.
I’ve been working on a proof-of-concept, and possibly even deployable, touch driver for HyperPixel which includes noise rejection and seems to track ~3 touches fairly well. Bear in mind that fitting even 3 fingers onto this display is a challenge, but it does mean we could potentially identify 3-touch scenarios and soft-map them to scrolling or even key presses. Who knows!
If you want to have a play with my code so far, here it is:
(Note it’s pretty rough and ready and poorly architected but it will recursively identify touches until it can’t find any more)
#!/usr/bin/env python
import time
import sys
import smbus

ADDR = 0x5c           # I2C address of the touch controller
bus = smbus.SMBus(3)  # the HyperPixel's touch controller lives on bus 3

WINDOW_SIZE = 10      # moving-average window for noise rejection
ROWS = 8
COLS = 14
H = 480
W = 800
STEP_Y = float(H) / (ROWS - 1)  # pixel pitch between electrode rows
STEP_X = float(W) / (COLS - 1)  # pixel pitch between electrode columns

# One window of recent samples per electrode. Note this must be a list
# comprehension: [[0] * WINDOW_SIZE] * (ROWS + COLS) would alias a single list
data = [[0] * WINDOW_SIZE for _ in range(ROWS + COLS)]


def read_byte(address, delay=0):
    return bus.read_byte_data(ADDR, address)


def smbus_read_touch():
    # Read all 44 bytes of raw electrode data in two chunks
    raw_adc = bus.read_i2c_block_data(ADDR, 0x00, 32) + bus.read_i2c_block_data(ADDR, 0x20, 12)
    adc_out = [0] * (ROWS + COLS)
    y = 0
    for x in range(0, len(raw_adc), 2):
        # Each electrode reading is a big-endian 16-bit value
        val = (raw_adc[x] << 8) | raw_adc[x + 1]
        data[y].insert(0, val)
        data[y] = data[y][:WINDOW_SIZE]
        y += 1
    # Average each electrode's window to reject noise
    for x in range(ROWS + COLS):
        val = 0
        for y in range(WINDOW_SIZE):
            val += data[x][y]
        adc_out[x] = int(val / WINDOW_SIZE)
    touch_x = list(reversed(adc_out[8:]))  # X electrodes come back in reverse order
    touch_y = adc_out[:8]
    #print(str(int(time.time() * 1000)) + " y: " + " ".join([str(x).rjust(4, ' ') for x in touch_y]) + " | x: " + " ".join([str(x).rjust(4, ' ') for x in touch_x]))
    touches = []
    found = []
    ignore_factor = 0.6
    # Repeatedly grab the strongest row/column intersection, interpolate a
    # position from its neighbours, damp those electrodes down, and go
    # looking for the next touch until nothing is left above the threshold
    while True:
        max_x = max(touch_x)
        max_y = max(touch_y)
        io_x = touch_x.index(max_x)
        io_y = touch_y.index(max_y)
        if max_x < 100 or max_y < 100 or (io_x, io_y) in found:
            break
        found.append((io_x, io_y))
        b_x = (STEP_X * io_x)  # + STEP_X / 2
        b_y = (STEP_Y * io_y)  # + STEP_Y / 2
        if io_x < (COLS - 1):
            b_x -= 1.0 - (float(touch_x[io_x + 1]) / touch_x[io_x]) * (STEP_X / 2)
            touch_x[io_x + 1] *= ignore_factor
        if io_x > 0:
            b_x += 1.0 - (float(touch_x[io_x - 1]) / touch_x[io_x]) * (STEP_X / 2)
            touch_x[io_x - 1] *= ignore_factor
        if io_y < (ROWS - 1):
            b_y -= 1.0 - (float(touch_y[io_y + 1]) / touch_y[io_y]) * (STEP_Y / 2)
            touch_y[io_y + 1] *= ignore_factor
        if io_y > 0:
            b_y += 1.0 - (float(touch_y[io_y - 1]) / touch_y[io_y]) * (STEP_Y / 2)
            touch_y[io_y - 1] *= ignore_factor
        touch_x[io_x] *= ignore_factor
        touch_y[io_y] *= ignore_factor
        touches.append((int(b_x), int(b_y)))
    touches = sorted(touches, key=lambda touch: touch[0])
    print(touches)


def smbus_config_touch():
    bus.write_byte_data(ADDR, 0x6e, 0b00001110)


if __name__ == "__main__":
    smbus_config_touch()
    while True:
        smbus_read_touch()
        time.sleep(1.0 / 1600)
Does that mean that, theoretically, you could track an indefinite number of touches (until they all merge into one BIG touch, that is)? I will test it in a little while. :D
Well with 14 rows and 8 columns worth of ADC readings the theoretical absolute limit would be 112 simultaneous stationary touches directly on top of each intersection. That would require some rather small fingers.
The real practical limit seems to be about 4 fingers, tracked along the Y axis only. This is potentially useful for gestures if we can figure out how to recognise them as such and inject the right events into uinput.
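For the curious, injecting synthetic touch events from user space could look something like the sketch below. This assumes the python-evdev library and a made-up device name; it’s not the code we actually ship:

from evdev import UInput, AbsInfo, ecodes as e

# Describe a virtual single-touch device covering the 800x480 panel
capabilities = {
    e.EV_KEY: [e.BTN_TOUCH],
    e.EV_ABS: [
        (e.ABS_X, AbsInfo(value=0, min=0, max=799, fuzz=0, flat=0, resolution=0)),
        (e.ABS_Y, AbsInfo(value=0, min=0, max=479, fuzz=0, flat=0, resolution=0)),
    ],
}
ui = UInput(capabilities, name='hyperpixel-touch')  # hypothetical name

def emit_touch(x, y, down=True):
    # Queue up position and button state, then flush the batch to the kernel
    ui.write(e.EV_ABS, e.ABS_X, x)
    ui.write(e.EV_ABS, e.ABS_Y, y)
    ui.write(e.EV_KEY, e.BTN_TOUCH, 1 if down else 0)
    ui.syn()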
I have since written a functional touch input driver based on my code above, which I’ve shared in Discord for those curious enough to try it. It has a fairly annoying problem where interpolation between the electrodes keeps drag-and-drop operations from being especially smooth. It’s also frustratingly hard to hit some icons, but that might have been because I was lazy and hadn’t cut my nails…
It proves the concept, though, and certainly fixes the issue with touches being all messed up along the bottom edge.
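For what it’s worth, one common way to tame that sort of jitter would be an exponential moving average over the interpolated coordinates; a quick sketch of the idea, not something the driver does yet:

class AxisSmoother(object):
    # Exponentially weighted moving average for one axis
    def __init__(self, alpha=0.3):  # lower alpha = smoother but laggier
        self.alpha = alpha
        self.value = None

    def update(self, sample):
        if self.value is None:
            self.value = float(sample)
        else:
            self.value += self.alpha * (sample - self.value)
        return self.value

One smoother per axis per touch, fed with each interpolated sample, would trade a little latency for considerably steadier drags.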