BruControl: Brewery control & automation software

Homebrew Talk

Just in case @BrunDog is listening... MAX31888 would be a great addition...

General Description
The MAX31888 is a 1-Wire, high-precision, low-power digital temperature sensor with ±0.25°C accuracy from -20°C to +105°C for precision temperature monitoring. The MAX31888 draws 68 µA during measurement and has 16-bit resolution (0.005°C).
 
Apologies for a fairly basic question (I think), but do the input (e.g. a thermistor) and output (e.g. an SSR control wire) for a PID element need to be on the same interface, or can we split them and have, for example, some inputs on one ESP32 and the associated outputs on another?
 
They have to be on the same interface. I asked this a long time ago, and the response was "safety": the PID could overheat if the interface hosting the input went down. When you are in the PID Element pane, you can only select inputs on the same interface.
 
Ok, thanks @oakbarn - suspected as much but thanks for the confirmation 👍
 
See my post
 
My AA-1's IC died because one of my Johnson proportional valves was drawing too much current (30 mA+). The other three channels worked, but when I went to use one, I killed the entire chip. Then I killed the IC I borrowed from my spare AA-1; that is when I got out the milliamp meter.

The LT1639CN is discontinued; it can still be found, but it is $12 each. I found that the similar LT1014CN is cheap on eBay ($10 for 10); the difference is a 22V rating instead of 44V, so if you are using 12V, you should be good. I will buy the 44V version when I get the problem sorted, but I am not sure what is going on: I swapped the valve control for a brand-new one and had the same issue, so I have some troubleshooting to do. Anyway, if you have an AA-1 and need a cheap spare, eBay has your hookup.
1713793455883.png
 
Question on pressure/volume sensors. I'm planning to add pressure sensors to my 2 vessel system. Considering the pros/cons of a bottom vs side mount for the sensors.

Bottom mount:
Pros -- measurement to zero volume.
Cons -- higher potential for overloading the sensor, liquid retention if tri clamp fitting (0 retention if selecting NPT fitting).

Side mount:
Pros -- low risk for overloading, no liquid retention
Cons -- no volume measured until ~1.5-2 gallons (in my vessel)

Are the risks of a bottom mount worth the added visibility below ~1.5 gallons? I could reduce the risk by installing directly beneath the heating element. Are the NPT (flush-mount) volume sensors still accurate? I assume they should be fine, but I am hoping someone has some experience. Reading back through this thread, a flow meter is likely a better indicator of volume transferred, which reduces the value of a pressure sensor.

Help me make a decision on this..!
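For what it's worth, the trade-off above can be sanity-checked with a little math: a hydrostatic pressure sensor reads P = ρgh, so pressure maps linearly to liquid height, and height maps to volume through the vessel geometry. Here is a minimal Python sketch assuming a straight-sided cylindrical kettle; the wort density, kettle diameter, and port height are illustrative numbers, not from any particular sensor or BruControl setup:

```python
# Hedged sketch: gauge pressure -> liquid volume for a cylindrical kettle.
# Density/diameter values are illustrative assumptions, not measured data.
import math

RHO_WORT = 1040.0   # kg/m^3, illustrative (roughly 1.040 SG wort)
G = 9.80665         # m/s^2, standard gravity

def volume_gal(pressure_pa: float, diameter_m: float,
               port_height_m: float = 0.0) -> float:
    """Liquid volume (US gal) from gauge pressure at the sensor port.

    port_height_m is the sensor port's height above the kettle floor.
    A side mount only reads once the liquid covers the port, so this is
    valid only at/above the port; below it the sensor reads ~0 and the
    true volume is unknown (the "no reading until ~1.5-2 gal" con above).
    A bottom mount is the port_height_m = 0 case.
    """
    height_above_port = pressure_pa / (RHO_WORT * G)  # h = P / (rho * g)
    liquid_height = height_above_port + port_height_m
    area = math.pi * (diameter_m / 2.0) ** 2
    return area * liquid_height * 264.172             # m^3 -> US gallons
```

Since the relationship is linear, any calibration offset (e.g. foam, trub, or sensor drift) shifts all readings by the same amount, which is easy to trim out with a one-point check against a known fill volume.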
 
I pondered the same question when I built, and ultimately I ended up with a side mount as low as possible (about 1.5 gallons in a 20 gallon pot). Having run it this way for 4+ years now, I have zero regrets about its placement.
 
I should be grateful for some help with my BruControl setup for fermenting, please.

I am using an Arduino Mega, 2 x PT100 with Adafruit MAX31865 shields, and a standard 6-channel Arduino-type relay board, and I hit an issue during my first trial run.

Everything is stable when idling, as well as when cooling is called for, but when the heat is activated my temperature reading goes haywire (generally into the minus range) for about 10 to 30 seconds (see photo) before it re-stabilises.

The issue appears to happen only on the FV1 side; the FV2 side appears stable.

Therefore, I should be grateful for your thoughts on what could be the issue with my system, please. Thanks.
Ferment.jpg
 
Is the heat source common to both FV1 and FV2, or individual? If individual, maybe try swapping the two heat sources and see if the problem moves with the exchange.
 
The heat is on a separate supply (240 volts) which is triggered by an individual relay.
 
Do you have good separation between your relay wiring and your PT100 cables? I'm wondering if this is an EMI issue: when heating is called for, it may be causing a spike that couples into one of your PT100 leads, producing the negative reading.
 
Hi Tartan1, thanks for your prompt reply. There is some separation between the wires, but you could be right about an EMI issue. I will look at moving the MAX31865s away from some nearby cabling. Thanks again.
 
Doing my monthly backup to Dropbox. If you do NOT have a good backup plan, you will not be happy one day. There is an automated backup in BruControl (to your local drive) which is great, but one day that might not be enough!
 
Great bit of advice, Oakbarn. I have got into the habit of backing up my data, using dedicated software to send it to a QNAP server.
 
I'm switching to a proportional SSR -- specifically the Crydom PMP2425W. Can I drive it with 5V PWM (not duty cycle) alone? Or should I convert the signal to an analog 0-5V in front of the PSSR?
 
It's been a while since I shared here, mostly because BC continues to run great, and my setup has been stable.

Problem
That said, I kept having issues with Tilt accuracy, as changing the battery throws off its original calibration, which is why the Tilt/Tilt2 app has its own calibration means. I was getting frustrated with changing my offset each time I measured my wort with my EasyDens to maintain some level of accuracy, since 1) that wastes wort and 2) my Supabase database carries inaccurate numbers until I adjust them (and I want some accuracy for a personal phone app and data analysis project I am doing).

Solution approach
Calibrating the Tilt for BC has been covered a bit in this thread, but I wanted to offer my approach: get it accurate enough for government work without having to calibrate with a custom wort/sugar solution each time the battery is changed, and without using a CSV lookup table. It requires a script, and the Tilt must be properly calibrated in pure water (with another gravity tool, like an EasyDens, agreeing the water is 1.000).

Assumptions
I work from the assumption that Tilt calibrates each unit at the factory with the OEM battery, hardcoding both the highest calibration point (e.g., 1.120) and the lowest (1.000). From there, the assumption is that Tilt uses a linear scale factor to convert X degrees of tilt angle to Y gravity. Since the Tilt's upper calibration point can't be changed, we have to work with whatever it thinks the scale is between the highest calibration and the new 1.000. If the battery weight makes that scale factor smaller or larger, we have to deal with it. But it is still linear in some way.

With that, since gravity drop is also linear, we can use the OG of the wort to set our own scale factor, where the Tilt is the independent variable and the actual SG (as referenced by an EasyDens or standard hydrometer, for example) is the dependent variable. By working from the OG of the given beer, you don't have to do a full-scale calibration when you change the battery (e.g., making a solution with a known gravity and recording what the Tilt thinks it is). It gives enough precision to make the Tilt data meaningful for whatever you need. You also don't need a lookup table or spreadsheet (though I use an Excel sheet with similar math to check what I should expect, so I can verify the script works well).
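For illustration, the single-point rescaling described above boils down to a few lines of math. This Python sketch mirrors the arithmetic of the BruControl script further down (the function and variable names here are mine, not BC syntax):

```python
# Sketch of the OG-anchored linear rescale described above. Assumes the Tilt
# was zeroed in pure water (reads 1.000 there), so one OG pair fixes the slope.
def adjust_sg(tilt_sg: float, tilt_og: float, actual_og: float) -> float:
    """Rescale a Tilt reading using gravity points (SG minus 1).

    ratio is the slope: actual gravity points over Tilt gravity points,
    both taken at the start of fermentation (the OG pair).
    """
    ratio = (actual_og - 1.0) / (tilt_og - 1.0)
    return (tilt_sg - 1.0) * ratio + 1.0

# Example: the Tilt showed 1.060 at pitch but the EasyDens measured 1.054
# (ratio = 0.054 / 0.060 = 0.9); a later Tilt reading of 1.020 rescales
# to about 1.018.
```

Note that because the water point (1.000) is the shared anchor, a reading of exactly 1.000 is unchanged by the rescale, which is why the pure-water calibration step matters.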

I hope this helps someone else who battles Tilt accuracy issues after changing the battery.

BC setup for the code
To start, you need your Tilt device set up without any calibration properties. You also need three globals: one for your Tilt's OG measurement, one for the actual OG (as measured by other means), and one for your adjusted SG value. Mine is set automatically based on Brewfather data, but you can just set the global in BC when you transfer to the fermenter.

The code
You can use and adjust this script accordingly for your layout/format. For me, "BB1" is used at the beginning of all my devices and globals for my BrewBuilt fermenter (1 being used to distinguish a potential future 2nd BrewBuilt fermenter; I do the same for my Spike and Fermzilla fermenters).

Code:
//Initialize the local script variables

new value oldTilt    //Gets what the Tilt device shows so the script will trigger when the value changes
new value ogEasyDens    //My measured OG, which I get from my EasyDens
new value ogEasyDens2    //Used for some math
new value ogTilt    //What the Tilt showed at the start of fermentation
new value ogTilt2    //Used for some math
new value ratio    //My scale factor
new value sgTilt    //What the Tilt currently shows
new value sgAdj    //The new SG value when set back to an actual SG format

oldTilt = "BB1 Tilt" SG    //Set what the starting Tilt value is at when the script starts

[Loop]
wait "BB1 Tilt" SG != oldTilt    //Wait until the Tilt's shown value doesn't match the old one
oldTilt = "BB1 Tilt" SG    //Reset the oldTilt value so this can trigger again on the next change

ogTilt = "BB1 Tilt OG" value    //Get data from the global used to store the Tilt's OG value
ogTilt2 = ogTilt - 1    //Subtract 1 so you are left only with the decimal value (or "gravity points")
ogEasyDens = "BB1 OG" value    //Get data from the global used to store the actual OG value
ogEasyDens2 = ogEasyDens - 1    //Subtract 1 so you are left only with the decimal value (or "gravity points")

ratio = ogEasyDens2 / ogTilt2    //Calculate the scale ratio (the slope... dependent variable over the independent variable)
ratio precision = 6    //Get more precision out of the ratio for more accuracy in the new value (i.e., minimize inaccurate rounding)

sgTilt = "BB1 Tilt" SG - 1      //Get the current Tilt value and subtract 1, leaving only the decimal value (or "gravity points")
sgAdj = sgTilt * ratio    //Multiply the gravity points by the scale value
sgAdj precision = 6    //Adjust to 6 decimals of precision to limit rounding
sgAdj += 1      //Add the 1 back to show an actual gravity value

"BB1 Gravity" value = sgAdj      //Set the adjusted gravity global to the calculated actual SG value

goto "Loop"    //Return to loop and wait for the Tilt to have a change in its value

Now, you should see what the gravity is with better accuracy and use that adjusted SG global for whatever you need (e.g., show on a custom site/app, push to Brewfather, trigger fermenter actions, etc.).

Limitations
As the calibration is based on OG readings of the Tilt and other means, the scale value is less precise than if you made a high-gravity solution and calculated a static scale value until the battery is changed again. But I don't want to make unnecessary high-gravity wort for this, and this has so far been accurate enough for (as a DC resident) government work.

Random gravity readings during fermentation with my EasyDens have matched the values this script returns within one gravity point, and that's just due to rounding, which we can't really control.

Even cooler would be if we could set the Tilt device's calibration properties via script, so we would not need a global for the adjusted value, e.g.:
"BB1 Tilt" offset = -1
"BB1 Tilt" linearmultiplier = ratio
"BB1 Tilt" offset = +1

But I get how tricky it would be when the calibration values in the properties have to be set in order and stored for the device.
 
I was looking over this and changed it a little so it made sense to me, using my camel-case naming convention.

My code:
new value vVoldTilt //Gets what the Tilt device shows so the script will trigger every time the Tilt value changes
new value vVogExternal //My measured OG, which I get from my EasyDens
new value vVogExternal_2 //Used for some math
new value vVogTilt_1 //What the Tilt showed at the start of fermentation
new value vVogTilt_2 //Used for some math
new value vVratio //My scale factor
new value vVsgTilt //What the Tilt currently shows
new value vVsgAdj //The new SG value when set back to an actual SG format
vVoldTilt = "MB_220_Green_Tilt" SG //Set the starting Tilt value when the script starts; the loop then triggers on every change
[Loop]
//my Green Tilt is on my My Main Brewery Mega at port 220 (MB_220_)
wait "MB_220_Green_Tilt" SG != vVoldTilt //Wait until the Tilt's shown value doesn't match the old one
vVoldTilt = "MB_220_Green_Tilt" SG //Reset the vVoldTilt value so this can trigger again on the next change
//*******************************************************************************************************
//*******************************************************************************************************
vVogTilt_1 = "gblV_Green_Tilt_OG" value //Get data from the global used to store the Tilt's OG value
//*******************************************************************************************************
//*******************************************************************************************************
vVogTilt_2 = vVogTilt_1 - 1 //Subtract 1 so you are left only with the decimal value (or "gravity points")
vVogExternal = "gblV_External_OG" value //Get data from the global used to store the actual OG value
vVogExternal_2 = vVogExternal - 1 //Subtract 1 so you are left only with the decimal value (or "gravity points")
vVratio = vVogExternal_2 / vVogTilt_2 //Calculate the scale vVratio (the slope... dependent variable over the independent variable)
vVratio precision = 6 //Get more precision out of the vVratio for more accuracy in the new value (i.e., minimize inaccurate rounding)
vVsgTilt = "MB_220_Green_Tilt" SG - 1 //Get the current Tilt value and subtract 1, leaving only the decimal value (or "gravity points")
vVsgAdj = vVsgTilt * vVratio //Multiply the gravity points by the scale value
vVsgAdj precision = 6 //Adjust to 6 decimals of precision to limit rounding
vVsgAdj += 1 //Add the 1 back to show an actual gravity value
"gblV_Green_Tilt_SG" value = vVsgAdj //Set the adjusted gravity global to the calculated actual SG value
goto "Loop" //Return to loop and wait for the Tilt to have a change in its value

What I do not understand is the
//*******************************************************************************************************
//*******************************************************************************************************
vVogTilt_1 = "gblV_Green_Tilt_OG" value //Get data from the global used to store the Tilt's OG value
//*******************************************************************************************************
//*******************************************************************************************************

Where does gblV_Green_Tilt_OG get its data point? Is it a measured value that you input to the global? I have not messed with my Tilts but will shortly.
 
Hah, love seeing the differences in setups and global/device names. That's what makes BruControl so great: you can truly choose your own adventure (and run your water sprinklers with it, if you wanted).
 
Hey all... I started getting my feet wet with scripting last night, and of course I've immediately hosed myself by endlessly looping an auto-starting script. This has led to my BruControl application freezing and becoming unresponsive, which requires me to kill it with Task Manager... similar to what @oakbarn experienced here: https://www.homebrewtalk.com/thread...trol-automation-software.624198/post-10284815
Can someone tell me where the scripts are saved within the app, so that I can remove this one and get back to breaking something else?
 
You can restore an old config file (e.g., yesterday's). But that'll undo any changes you made in BC since that file was created.

Alternatively, make a copy of the current default.brucfg as a fresh backup, and then, in the original file, search for the script's code element. Make sure to remove all opening and closing tags related to it.

Looking at mine, you'd likely want to start with the <Process> tag before that code and end at the </Process>.

You could also just resolve the issue by inserting "sleep 1000" so the script does not go into a death-spiral loop (that gives you a 1-second delay), or use "wait" if it can be event-based.
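If you'd rather script the surgery than hand-edit, here is a hedged Python sketch. It assumes default.brucfg is well-formed XML containing <Process> elements as described above (I have not verified the schema beyond that post); the paths and name matching are illustrative, and you should only ever run it against a backup copy, never the live config:

```python
# Hedged sketch: remove a runaway script's <Process> block from a COPY of the
# config. Assumes the .brucfg file parses as XML with <Process> elements;
# the tag name and matching logic follow the forum description, not any
# official BruControl file-format documentation.
import xml.etree.ElementTree as ET

def remove_process(config_path: str, script_name: str, out_path: str) -> int:
    """Delete every <Process> whose serialized content mentions script_name.

    Returns the number of <Process> elements removed; writes the cleaned
    tree to out_path, leaving config_path untouched.
    """
    tree = ET.parse(config_path)
    root = tree.getroot()
    removed = 0
    # Snapshot all elements first so we can safely detach matching children.
    for parent in list(root.iter()):
        for child in list(parent):
            if child.tag == "Process" and \
                    script_name in ET.tostring(child, encoding="unicode"):
                parent.remove(child)
                removed += 1
    tree.write(out_path, encoding="utf-8", xml_declaration=True)
    return removed
```

Writing to a separate out_path (rather than overwriting in place) means a bad match costs you nothing: diff the two files before swapping the cleaned one in.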
 