What is the best physical set up for a BH1745 RGBC sensor?

My daughter and I are trying to build a bead sorting machine. Off a 2040 we’ve got a servo to bring beads under a RGB sensor, and a stepper to control a deposit disc (that will hold containers).

The mechanics seem to work well but I’ve had really poor luck with the RGB detection. I started with a cheap (in every sense) “TCS34725” but the output was garbage. So I bought a Breakout BH1745 from you and while the range of garbage is much wider —it’s making up all sorts of wonderful colours— it’s still not what I want. Which means I’m probably doing it wrong.

There’s loads of light leak through the sides but it also doesn’t work in a pitch black room. I’ve tried messing around with measurement times, LEDs, ADC multipliers and it all just makes it worse. My conclusion is there’s something wrong with the physical setup.

So what’s the Right Way™ to use a BH1745 to measure the colour of something very small directly underneath it?

  • Do I need complete light isolation?
  • Should the chamber (and its surround in the disc that brings it to the sensor) be black or white or something else?
  • Is there anything I could be doing to calibrate the sensor on an empty chamber?
  • Is there a physical distance between bead and sensor I should be aiming for? Do I need a lens?

I don’t think there’s anything special about my code. I’m using the BreakoutBH1745 library from the Pimoroni build of MicroPython. I think this is the smallest version of my code that still works.

from machine import I2C, Pin
from breakout_bh1745 import BreakoutBH1745

i2c = I2C(1, scl=Pin(15), sda=Pin(14))  # rp2 machine.I2C wants Pin objects, not ints
rgb_sensor = BreakoutBH1745(i2c, 0x38)
rgb_sensor.leds(1)

while True:
    rgb = rgb_sensor.rgbc_scaled()[:3]
    print('\033[48;2;{0};{1};{2}m#{0:02x}{1:02x}{2:02x}\033[0m'.format(*rgb))

Again, I’ve tried the various rgbc_raw / rgbc_clamped / rgbc_scaled methods too.

That’s an interesting problem! I have one and tinkered with it, trying to detect whether the ambient light was red, green, or blue. I didn’t do anything with an object’s reflected light, etc. The code I used is here:
Want to use a BH1745 for Ambient Lighting - Support - Pimoroni Buccaneers

If you don’t mind posting your code I’ll play around with it.

If you read the thread linked to by @alphanumeric, you will also see a post from me. I think the Pimoroni driver is one cause of the problem, because it fiddles with the relative sensitivity of the channels (quoting myself: “Especially the initial scaling levels out the spectral response as a whole.”)

The next thing you have to do is tune your algorithm. In fact, this would be a perfect problem for a machine-learning solution: train the algorithm with a large number of beads until the results are good.

But to be honest, you don’t need AI for this; it’s the “training” part that matters. I would take repeated measurements with a number of beads, save the raw results from the sensor, and try some simple statistics (means, trimmed means, or medians) to find a discriminant measure. I would also always take repeated measurements of every single bead to reduce measurement noise.
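To make the repeated-measurement idea concrete, here’s a minimal sketch in plain Python (so it also runs under MicroPython, which lacks the `statistics` module). `read_rgbc` is a stand-in for whatever zero-argument callable returns a reading, e.g. `rgb_sensor.rgbc_raw` on the device:

```python
def _median(values):
    # Median without the statistics module (MicroPython-friendly).
    s = sorted(values)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else (s[mid - 1] + s[mid]) / 2

def robust_rgbc(read_rgbc, n=16):
    """Take n readings and return the per-channel median.

    The median throws away the occasional wild outlier that a
    plain mean would drag along, which is exactly the failure
    mode being described here.
    """
    samples = [read_rgbc() for _ in range(n)]
    return tuple(_median(ch) for ch in zip(*samples))
```

On the Pico you would call it as `robust_rgbc(rgb_sensor.rgbc_raw)` inside the sorting loop; `n` trades speed against noise rejection.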


I like the idea, but that feels far outside the scope of this dinky afternoon project. I’m also ashamed to say, as a software developer, that ML is outside my repertoire. It feels like such a big topic at this point; any idea how to break in, ideally on something relevant to this?

Just to explain our code a little further: the idea is that we measure the colour, then do some maths to see how similar it is to previously measured bead colours. If they’re similar enough (RGB → Lab, then CIE94), we pick the same output bucket; if they’re not, the colour gets entered as a new one (with its own output bucket). There’s a physical limit, so if a bead isn’t in the first 8 colours, it gets thrown into a reserved bin bucket for beads to have a second (etc.) pass.
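For anyone curious, the bucket logic boils down to something like this sketch. I’ve used plain Euclidean distance here as a stand-in for our actual Lab/CIE94 comparison, and `assign_bucket`/`threshold` are made-up names, not our real code:

```python
def assign_bucket(colour, buckets, threshold, max_buckets=8):
    """Return the bucket index for `colour`, updating `buckets` in place.

    If the nearest known bucket colour is within `threshold`, reuse
    that bucket; otherwise register a new one. Once all max_buckets
    slots are taken, everything else falls into the reserved
    overflow bin (index max_buckets) for a second pass.
    """
    def dist(a, b):
        # Stand-in metric; the real project converts to Lab and uses CIE94.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    if buckets:
        best = min(range(len(buckets)), key=lambda i: dist(colour, buckets[i]))
        if dist(colour, buckets[best]) <= threshold:
            return best
    if len(buckets) < max_buckets:
        buckets.append(colour)
        return len(buckets) - 1
    return max_buckets  # reserved "second pass" bin
```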

As it scans through, I have it output the colour it’s just scanned and the bucket it thinks the bead should go in (zero-indexed). The problem is that the RGB sensor is so wildly inconsistent. Here is the rgbc_raw() output from a run with no beads, so it’s just scanning the chamber. (Edit: it won’t let me post that screenshot, but it’s a nasty mix of red, magenta, and cyan.) rgbc_scaled() is a little better: the colours are “wrong” but much closer, and through Lab/CIE94 it can pick apart enough differentiation to use my bucket system.

So I don’t think I need ML yet; I need better input. All the ML in the world isn’t going to help if one second we’re reading red and the next it’s cyan.


On the actual code/maths quality in the C library (which this is using): yeah, I’ve seen all sorts of jank. I’m not an imaging professional, so I’m not sure how they expect their sensor to work, but many chips come with “known” relative sensitivities for the R, G, B, and C channels, so I’m not against default multipliers.
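If it came to it, applying our own per-channel multipliers instead of the driver’s would be trivial. The gains below are placeholders for illustration, not real BH1745 sensitivities (those would come from the datasheet or calibration):

```python
# Placeholder gains -- substitute values from the BH1745
# datasheet or from calibrating against known bead colours.
CHANNEL_GAIN = (1.0, 1.0, 1.8, 1.0)  # r, g, b, c

def apply_gains(rgbc, gains=CHANNEL_GAIN):
    """Scale raw channel counts by per-channel multipliers."""
    return tuple(v * g for v, g in zip(rgbc, gains))
```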

I’ve got to run, but I did also find the bh1745-python library, which I think is meant for the full-fat Raspberry Pi. It appears to have a little more flexibility, so I might port it to MicroPython (just the I²C code?) and have a go with that.

Thanks for both your help.


I did not recommend ML. But if you are not time-constrained, I would at least try measuring 4, 16, or 64(?) times and averaging the measurements. If you calculate the mean and variance, you should be able to see whether the sensor is wildly inconsistent or just lacking precision.
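That check is a few lines of plain Python (on the device you would collect the samples from rgbc_raw() first):

```python
def channel_stats(samples):
    """Per-channel (mean, population variance) for a list of
    (r, g, b, c) tuples.

    A variance that is large relative to the mean points to
    genuine inconsistency; a small variance around a "wrong"
    mean points to a bias/scaling problem instead.
    """
    stats = []
    for ch in zip(*samples):
        n = len(ch)
        mean = sum(ch) / n
        var = sum((x - mean) ** 2 for x in ch) / n
        stats.append((mean, var))
    return stats
```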

There is also this note in the datasheet: “Operating the IC in the presence of a strong electromagnetic field may cause the IC to malfunction”. So maybe you should try some measurements far away from your motors, just to be sure they are not the source of the observed inconsistencies.