I recently bought two Robopeak USB displays in the hope of using them for a camera project I’m making. I would like to use it in conjunction with a Pi Zero (only one, that is) to preview the camera’s image.
The Pimoroni store page states that the installer will run “on the Raspberry Pi B+, 2 and Pi Zero.” However, after downloading and running the installer, I never got it to work on my Zero (or Zero W). I tried many times; once I managed to get it to show a very distorted and unusable image of the GUI, but I never managed to replicate it.
After some more exploration, I found that the official page for the display (http://www.robopeak.com/blog/?p=406) doesn’t explicitly state support for the Pi Zero.
Does anyone know of some way to get the drivers running on this kernel? I’ve tried with both my Pi 3 and Pi B (original), but with no success, although I didn’t expect any.
Pi Zero NoIR Camera with Preview Screen
Which installer are you talking about?
I created this: https://github.com/pimoroni/rp_usbdisplay to work around the un-maintained, impossible to update OS image that RoboPeak supply, but it relies upon me compiling the modules for every new kernel release, so sometimes it falls behind.
Thank you for the speedy reply, it’s appreciated!
Yes, I’m using that installer. I recently re-flashed my SD card with the newest Raspbian image, and I hadn’t re-tried the screen since then until now, which is presumably why I’m getting a different result.
With Raspbian completely updated, the installer now recognises that the screen is plugged in, which it usually didn’t before, and gives me a useful error message. It seems the problem is just to do with new kernel releases, although I still don’t know why I was getting the strange graphical errors beforehand.
I’ve attached some images just for the benefit of completeness:
Aha. Yup. New Kernel release that I was not on top of.
I’ve built the Pi 3 version, now working on the Pi Zero ;)
You should be able to re-clone the GitHub repo and try the install again :D
I know who to come to for driver problems in the future (hint hint :3).
One problem though… The return of the dreaded graphical error.
It installs fine on the Zero (I’m yet to try it on my Pi 3), and I’ve changed the xorg config file to the one provided to get it working with the GUI, but it seems inoperable. I’ve attached a new image in the Google Drive link (https://drive.google.com/drive/folders/0B-0UNek611hKakl1aU9tT3k2Q0E?usp=sharing), although I can’t vouch for photo quality.
I’m no expert on USB protocols or display drivers, but it looks to me like it’s just trying to display too much information on the screen at once, e.g. too high a resolution.
I am yet to change the cmdline.txt to display the terminal on it, but I suspect the same outcome, since this is what happened to me on previous versions of the kernel.
And I may as well say thanks again, because this is about the best support I’ve gotten from anybody yet!
Happy to help! And thank you ;)
I’d say you’re correct in your assumption that it’s displaying too much information. Are you using fbcp to copy the main framebuffer to the display, or something else?
By default the total mismatch in resolution just results in a garbled mess like this, since the framebuffer is just sent as a looong list of pixels with no inherent information about its width/height. That’s why there’s a weird wrapping effect.
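To make the wrapping concrete (using an assumed desktop resolution, since the thread doesn’t state one): if the Pi’s framebuffer were 1920 pixels wide and the panel is only 320 pixels wide, each source row would spill across six panel rows, which is exactly the diagonal smearing visible in the photos.

```shell
# Assumed numbers: a 1920-pixel-wide framebuffer pushed raw to a
# 320-pixel-wide panel wraps each source row across multiple panel rows.
src_width=1920
panel_width=320
echo "each source row wraps across $((src_width / panel_width)) panel rows"
# -> each source row wraps across 6 panel rows
```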
You can do crazy stuff like try to force the Pi’s primary framebuffer to the display’s native resolution (is it 320x240?) or you can fire up another X desktop on that display.
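If you do want to try forcing the Pi’s primary framebuffer down to the panel’s native size, the usual place is /boot/config.txt (assuming the panel really is 320x240 — worth checking against the display’s specs):

```
# /boot/config.txt -- force the primary framebuffer to the panel size
framebuffer_width=320
framebuffer_height=240
```

After a reboot everything renders at 320x240, which makes the raw pixel stream line up with the panel width.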
I’ve not played with X configuration enough recently to be of much help here, but 10-robopeak.conf, found here: https://github.com/pimoroni/rp_usbdisplay/tree/master/extra, should make the Robopeak display the default display.
I did dig one of these displays up from my bits box just a few hours ago, though, so I might be able to fire it up and follow along.
What sort of setup are you going for?
Well, I had to google what fbcp was in order to figure out if I’m using it or not, and I’m still not sure, so I’m going to go with: probably not. Possibly, but I’d need to check… somehow…
The resolution looks like something ridiculously small like 320x240, so you’re probably correct with that.
The only thing I messed around with in the X desktop files was copying the 10-robopeak.conf file into the xorg.conf.d folder, which did switch over the entire GUI to the smaller screen on start. Interestingly, I did note that the touchscreen function worked on the GUI even when I didn’t have that .conf file copied, and it wasn’t displaying anything, but I presume that’s just due to it being set up as a dual I/O device.
I haven’t even started with anything yet, but my end goal is to display a preview of a NoIR Pi Camera on the screen for an IR camera I want to build. I wanted to use this in conjunction with the Zero just to keep things nice and compact. I purchased two so I could use the other whenever I needed to hook the Pi up somewhere a little more portable.
I’ll try getting it working on my Pi 3, but you’ll have to bear with me, since I’m using it to post here at the moment, and I’m running short on SD cards that I haven’t re-flashed 20 times over.
Edit - Success! The screen works like a charm on my Pi 3, and I even managed to get out of that Xdesktop without having to pull the plug! It seems it’s just my Pi Zeros which are having trouble.
Camera preview is an interesting idea, a quick look for how it could be done leads to this thread: https://www.raspberrypi.org/forums/viewtopic.php?f=43&t=45059
Which brings us back to fbcp :D Which, in long-hand, is “framebuffer copy”.
The issue with the camera preview is that it’s rendered in the graphics chip and normally won’t display on anything but the primary display. It can be made to work by copying that data over, though, at least as I understand it.
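Conceptually, fbcp just copies one framebuffer device to the other in a loop. Stripped to its essence, and simulated here with ordinary files in place of /dev/fb0 and /dev/fb1 (the real tool also rescales between the two resolutions as it copies):

```shell
# Simulate fbcp's core job: copy the 'primary framebuffer' to the
# 'secondary' one. Real fbcp reads /dev/fb0 and writes /dev/fb1 in a loop.
printf 'raw pixel data' > /tmp/fb0.sim   # stand-in for /dev/fb0
dd if=/tmp/fb0.sim of=/tmp/fb1.sim bs=64k 2>/dev/null
cat /tmp/fb1.sim   # -> raw pixel data
```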
There must be a better way of doing this, though, using a video feed or something like that. If you can stream the camera over the internet, then it’s got to be pretty straight-forward to stream it to a connected display!
Haha, you would hope so!
I was planning to simply retro-fit my way into making it work by running the preview straight through the GUI from boot-up via a little Python script, and then just setting up the X desktop to use the little display as its primary. Suffice to say, that’s all in layman’s terms and I’m yet to run this in any practical situation.
I could try running the camera preview through my Pi 3 and seeing if it will send nicely to the screen, but it’s an effort trying to find such a small camera in my box of stuff! Luckily I’m just re-using one of your Pimoroni boxes from shipping to keep everything together, so it’s not too much of a hassle to tip it out onto the floor.
Sounds like it is quite feasible then, but beyond my capabilities…
Ha, yeah, I know how he feels.
This app runs like a normal app, but it doesn’t output any text to the console (some debug messages are sent to syslog). You can run it with & to put it in the background. Before using this, run…
Any clue as to whether or not this would hog the entire Pi’s display settings? What I mean to say is, during development/debugging, if I needed to use my 21" monitor on HDMI to see the script properly, would I have to try and ‘undo’ the changes beforehand? I know this is a bit speculative at the moment, but, hey, I haven’t even attempted anything like this yet.
OK; I hope you never have to dismantle that AIY Voice project kit to get at a camera port. It’s painful.
Anyway, despite that, I’ve definitely confirmed that the screen isn’t used by default for camera preview in Python, although I haven’t actually managed to get the screen to show the GUI on the Zero yet, so I did have to test it on the Pi 3. (It shouldn’t make a difference, but I like to be thorough)
Now time to re-make the AIY kit…
I believe the camera is supported in video4linux, and various streaming distributions will show in-browser previews and all sorts of things. It must be possible to set up a streaming feed to the display that supports taking snaps. It would break the convenience of the Python camera API though.
Actually scratch all that junk below. Just for the sake of getting the camera streaming onto the framebuffer display you can use:
sudo modprobe bcm2835_v4l2
sudo SDL_VIDEODRIVER=fbcon SDL_FBDEV=/dev/fb1 mplayer -vo sdl -tv driver=v4l2:device=/dev/video0:width=320:height=240 tv://
But this locks up the video device so you wouldn’t be able to capture stills, I believe.
Using a weird combination of uv4l and dd I’ve managed to get a capture to copy directly from /dev/video1 (supplied by uv4l) to /dev/fb1, preserving the colour.
If you want to give uv4l a try, you can add the apt key:
curl http://www.linux-projects.org/listing/uv4l_repo/lrkey.asc | sudo apt-key add -
Then add the following to /etc/apt/sources.list:
deb http://www.linux-projects.org/listing/uv4l_repo/raspbian/ jessie main
sudo apt-get update
sudo apt-get install uv4l uv4l-raspicam
To set up /dev/video1 with uv4l I used the command:
uv4l --driver raspicam --width=320 --height=240 --encoding=rgb565
To send the capture from camera to framebuffer I used the command:
dd if=/dev/video1 of=/dev/fb1 bs=150k count=1
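The bs=150k count=1 isn’t arbitrary: at 320x240 with RGB565 encoding each pixel is 2 bytes, so one full frame is exactly 150 KiB — a single dd block of that size grabs exactly one frame.

```shell
# Frame size for a 320x240 RGB565 capture (2 bytes per pixel):
width=320; height=240; bytes_per_pixel=2
frame_bytes=$((width * height * bytes_per_pixel))
echo "$frame_bytes bytes = $((frame_bytes / 1024))k"
# -> 153600 bytes = 150k
```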
Right now I seem to be having trouble with really, really dark images. Yeah it’s dark in here, but not that dark. Frustratingly my first image will come out nice and bright, but all subsequent captures are just dark dark dark :/
You can also edit /etc/uv4l/uv4l-raspicam.conf and run the system service:
sudo service uv4l_raspicam start
Okay, now I have a working, albeit slightly crazy, setup for watching live video on the rp_usbdisplay and capturing screenshots (albeit at 320x240 resolution, but meh) using nothing but the v4l module and mplayer.
A live stream plays onto the display, and tapping the resistive touchscreen triggers a screenshot.
First, install mplayer:
sudo apt install mplayer
Then edit /etc/modules and add: …
My main script is /home/pi/mp.sh with the contents:
sudo SDL_VIDEODRIVER=fbcon SDL_FBDEV=/dev/fb1 mplayer -input conf=/home/pi/input.conf -vf screenshot -vo sdl -tv driver=v4l2:device=/dev/video0:width=320:height=240:fps=16:gain=1 tv://
Then create in /home/pi a file called input.conf with the contents: …
Run mp.sh and tap to shoot :D
You might also want to crontab -e and add an entry…
Works well as a proof of concept, anyway :D
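For anyone following along, a guess at the input.conf piece: mplayer’s input.conf maps input events to commands, and a tap on the resistive touchscreen registers as the first mouse button, so binding MOUSE_BTN0 to screenshot should be all it takes. The exact binding is my assumption, not confirmed from the thread, and the file lives at /home/pi/input.conf in the script above (a temp path is used here just for illustration):

```shell
# Write the (assumed) input.conf: a tap / left-click triggers a screenshot.
# mp.sh hands this file to mplayer via -input conf=/home/pi/input.conf.
cat > /tmp/input.conf <<'EOF'
MOUSE_BTN0 screenshot
EOF
cat /tmp/input.conf   # -> MOUSE_BTN0 screenshot
```

With -vf screenshot already on the mplayer command line, each tap should drop a shotNNNN.png into the working directory.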
Suggestion: you need a super like button. :D
Well, meaning no offence to your explanation, that was definitely way above my level of Linux understanding! I had originally planned on using two physical buttons on the camera (one to activate some IR LEDs and another to take the still), but I never really thought of using the screen itself as the button. Well, at least I have a spare button for another project now!
I’m afraid I’m a little busy at the moment with it being exam season, but the next opportunity I get I’ll try and get this up and running on one of my Pi Zeros and test it out!
Don’t worry too much about the small resolution images; the fact it’s working at all is a miracle to me. And, as you said, it’s a proof of concept to build on! :D
Query: Should I switch over to another topic now, since you’ve managed to fix the screen issues and it’s no longer a ‘support’ issue, more like a project?
You’re welcome, and yes, my explanation is halfway between a brain dump and an explanation; I knew that much as I was typing it :D
Tapping the screen doesn’t yield the best response, which is unfortunate, but it sort-of works :D Buttons would be far better.
I’m wondering if I can have mplayer capture the image in a higher resolution while downscaling for the screen, which would certainly improve things. Motion blur is a bit of a pain, too, but I was testing it in a dark room at far too late-o-clock!
A new topic is a good idea, if you drop a post in the projects forum explaining what you’re after I can probably move some of the relevant posts from here.