Apologies for coming into this discussion late, and I don’t know whether you’re still interested in this topic, but I’ve made some progress in understanding the gas sensors’ sensitivity to temperature, humidity and air pressure changes, as well as their drift over time.
I logged many hours of the gas sensors’ resistance (Rs) against the BME280’s temperature, humidity and air pressure readings, then ran a regression analysis to determine how each Rs is affected by those factors.
I found that:

- The Red Rs varies by -9000 ohms per degree Celsius (relative to the 23 °C baseline referenced in the sensor’s spec) and by -1750 ohms per % relative humidity (relative to the 50% baseline in the spec). Air pressure changes had negligible impact.
- The Oxi Rs varies by -10000 ohms per degree Celsius and by -646 ohms per % relative humidity (same baselines), and by 2639 ohms per hPa of air pressure, relative to a 1013 hPa baseline.
- The NH3 Rs varies by -16000 ohms per degree Celsius (same baseline), showed negligible impact from relative humidity changes, and varies by 1526 ohms per hPa of air pressure, relative to a 1013 hPa baseline.
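The slopes above can be applied as a simple linear correction. Here’s a minimal sketch (my own illustration, not code from the repo) that removes the temperature, humidity and pressure effect from a raw Rs reading, using the regression values quoted above:

```python
# Linear Rs compensation using the regression slopes quoted in the post.
# Baselines (23 °C, 50 % RH, 1013 hPa) are the ones referenced in the
# MiCS-6814 spec; the slopes are ohms per unit deviation from baseline.

SLOPES = {
    # sensor: (ohms per °C, ohms per % RH, ohms per hPa)
    "red": (-9000, -1750, 0),
    "oxi": (-10000, -646, 2639),
    "nh3": (-16000, 0, 1526),
}

BASE_TEMP, BASE_HUM, BASE_PRESS = 23.0, 50.0, 1013.0

def compensate_rs(sensor, raw_rs, temp_c, hum_pct, press_hpa):
    """Return Rs with the modelled temp/humidity/pressure effect removed."""
    t_slope, h_slope, p_slope = SLOPES[sensor]
    correction = (t_slope * (temp_c - BASE_TEMP)
                  + h_slope * (hum_pct - BASE_HUM)
                  + p_slope * (press_hpa - BASE_PRESS))
    # Subtract the modelled deviation to get back to baseline conditions.
    return raw_rs - correction
```

At baseline conditions the correction is zero, so the raw reading passes through unchanged; one degree above baseline adds 9000 ohms back to the Red Rs, and so on.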
There was significant Rs drift during warmup (the 6814’s application note advises a 100-minute warmup time) and a lesser degree of drift afterwards. I addressed the warmup drift by taking a calibration reading of each Rs in clean air after the warmup period and storing that reading as an updated R0 for each sensor. The catch is that you need to ensure the sensors aren’t exposed to abnormal gas levels at the time of that 100-minute calibration. I addressed the longer-term drift by triggering a recalibration at a set time each day (in my case 3am, when the outside air is likely to be clean), adding it to a list of the daily readings for the previous week, and updating each sensor’s R0 with the average of that week’s calibration readings. Not perfect, but much better than leaving the sensors uncalibrated.
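The rolling-week baseline scheme can be sketched like this (a simplified illustration of the idea, not the repo’s actual implementation; the class name and seven-day window are my own choices):

```python
from collections import deque

class BaselineTracker:
    """Hold a rolling window of daily clean-air Rs readings and expose
    their mean as the sensor's working R0."""

    def __init__(self, days=7):
        # A bounded deque drops the oldest reading automatically
        # once `days` readings have accumulated.
        self.readings = deque(maxlen=days)

    def recalibrate(self, clean_air_rs):
        """Record one daily clean-air reading (e.g. taken at 3am)."""
        self.readings.append(clean_air_rs)

    @property
    def r0(self):
        """Average of the retained clean-air readings."""
        return sum(self.readings) / len(self.readings)
```

Each 3am reading pushes the oldest one out of the window, so a single anomalous night (e.g. unusually dirty air during the calibration slot) only shifts R0 by a seventh of its error rather than replacing the baseline outright.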
I also noted in this topic that there were concerns about the Oxidising sensor’s resistance approaching 400k when it should be 20k max. My understanding is that the 20k figure applies to R0 (i.e. resistance in clean air at 23 °C and 50% RH), while the Rs figure (i.e. what the Enviro+ measures) will be higher as the sensor is exposed to oxidising gases and to time/temperature/humidity/air pressure changes. During my analysis, I saw Oxi Rs readings between 21k and 215k.
I’ve coded the above compensation into my environment monitor (https://github.com/roscoe81/enviro-monitor), which also includes algorithms to convert Rs/R0 into approximate ppm levels for each sensor, using the graphs in the 6814’s datasheet.
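For anyone curious how the ppm conversion works: the datasheet curves are close to straight lines on log-log axes, so the ratio can be mapped to ppm with a power law. The coefficients below are hypothetical placeholders for illustration only; you’d fit your own `a` and `b` values to the published curve for each gas.

```python
# Approximate ppm from the Rs/R0 ratio via a power-law fit to the
# datasheet's log-log curves:  ppm ≈ a * (Rs/R0) ** b
# NOTE: a and b below are placeholder values, NOT datasheet figures.

def ratio_to_ppm(rs, r0, a=1.0, b=-1.0):
    """Convert a compensated Rs and baseline R0 to an approximate ppm."""
    ratio = rs / r0
    return a * ratio ** b
```

A straight line on log-log axes means log(ppm) is linear in log(Rs/R0), which is exactly what the power law expresses; `b` is the line’s slope and `a` its intercept.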
Hope that helps.