RoasTime 2.0 and IBTS

Just purchased a new Bullet with IBTS. Downloaded the RoasTime Beta and have it working on a Win 10 PC. When roasting, it shows both the bean probe temperature and the IR bean temperature in digital form, but it only graphs the bean probe temperature, drum temperature, and ROR. Is there a way to graph the IR bean temperature? I thought this measurement was going to be the future of coffee roasting, with real-time readings and consistent FC/SC temperatures across different batch sizes.

Does anyone have a guide on what drum speed to use and how drum speed affects the roast?

Thanks

Peter

After charging, the drum temp is the IR bean temp.

Thanks - that makes sense

Wonder where the ROR is derived from: the infrared sensor (IBTS) or the bean probe (BT)?

@bertje1959, as of right now, my understanding is ROR is derived from the probe, not (yet) the IBTS.


Kafei (from Aillio) answered this back on 1/29 in this reply: ROR calculation

Well, this surprises me, as the IBTS measures better and is less dependent on the amount of coffee you put in the roaster. And the IBTS works only with RT2 (doesn’t it?), so why didn’t they change the algorithm to derive the ROR from the IBTS? But I am confident they will change this in the future; otherwise the IBTS is rather useless.

I thought about this and am optimistic it will change in the future as well. There could be a few reasons (these are my guesses, based on no direct insight aside from general technology-industry experience):

  • The vast majority of Bullets in the field don’t have the IBTS yet. The change that swapped the old Drum Temperature reading for the IBTS reading but left everything else the same was likely the fastest way to do the best they could for both types of users.
  • The IBTS is touted as a more accurate reading of the temperature, and it removes some charge-size-based variability in milestones that other probes can’t account for. However, it seems that for most of a normal roast, the shape of the two temperature curves stays similar. That’s what ROR is actually reflecting, and since the rate of change of one curve is close enough to that of the other, it doesn’t make a real difference which one you use. (If, over a given period, the old temperature probe starts at 100 and ends at 105 while the more accurate IBTS probe starts at 115 and ends at 120, your ROR’s still 5, right? A quick sketch of that arithmetic follows this list.)
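
For what it’s worth, here’s that second point as a toy sketch (plain Python of my own, nothing to do with Aillio’s actual code): a constant offset between the two curves cancels out of the ROR entirely.

```python
# Toy demo: a constant offset between two temperature curves
# leaves the rate of rise (ROR) unchanged.

def ror(samples):
    """Degrees of rise between consecutive samples."""
    return [b - a for a, b in zip(samples, samples[1:])]

probe = [100, 105, 111, 118]      # hypothetical bean-probe readings
ibts = [t + 15 for t in probe]    # same curve, shifted up 15 degrees

print(ror(probe))  # [5, 6, 7]
print(ror(ibts))   # [5, 6, 7] -- identical; the offset cancels out
```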

Yes, right. The curves are almost the same, and I can imagine the programmers made this choice; otherwise the machines without the IBTS would become useless. Perhaps they can add an option in the settings for where the ROR is derived from: the IBTS or the bean probe.

That would be nice, though I suspect whether or not it’s easy depends on a lot we don’t know about how the software (and possibly the firmware) is written. 🙂

I believe your thoughts could well be in line with theirs; from what I’ve read in Aillio articles, they have a lot of excitement about where IBTS could take them with their roasters. Maybe if they get it implemented well enough, it may become a business opportunity for them as a measurement option for other coffee roasters in the industry. I’m reading my own tea leaves a bit here, but who knows?

You can ask any programmer: this will not be a problem. The algorithm stays the same; only the data variables are different. And both sets of data, IBTS and BT, are read from the Bullet R1.
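
To sketch what I mean (hypothetical Python of my own, with names I made up, not RoasTime’s actual code): the ROR routine is one function either way, and the imagined setting just picks which series it’s fed.

```python
def ror(temps):
    """Rate of rise between consecutive samples, whatever the sensor."""
    return [b - a for a, b in zip(temps, temps[1:])]

# Hypothetical "ROR source" setting: same algorithm, different variable.
def ror_for(roast_data, source="bean_probe"):
    return ror(roast_data[source])

roast_data = {
    "bean_probe": [150.0, 120.0, 112.0, 118.0],  # probe shows the "turn" dip
    "ibts": [80.0, 95.0, 112.0, 130.0],          # IBTS: no turn, just climbs
}
print(ror_for(roast_data, "bean_probe"))  # [-30.0, -8.0, 6.0]
print(ror_for(roast_data, "ibts"))        # [15.0, 17.0, 18.0]
```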

It would be nice to have ROR available for both readings, with the ability to disable either one based on which sensors you have.

This and adding smoothing would be good enhancements.
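
On smoothing, even a simple moving average over the ROR series would take the edge off the spikes; a toy sketch (my own, not necessarily how RT2 would implement it):

```python
def smooth(values, window=3):
    """Moving average; the window shrinks near the edges."""
    out = []
    for i in range(len(values)):
        lo = max(0, i - window // 2)
        hi = min(len(values), i + window // 2 + 1)
        chunk = values[lo:hi]
        out.append(sum(chunk) / len(chunk))
    return out

noisy_ror = [5, 9, 2, 8, 4, 7, 3]
print(smooth(noisy_ror))  # spikes pulled toward the mean
```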

Thinking about this some more, another factor behind leaving the ROR curve as-is, tied to the old temperature probe, could be a desire not to introduce too many drastic changes to everyone’s processes at once. And moving ROR to the IBTS curve would introduce one or two differences that are bound to affect a lot of people.

In switching to IBTS, the “turn” goes away. I’ve seen it argued (and believe myself) that the turn is less a real thing that happens than an artifact of less accurate measurement tools. By itself, I’m not sure that’s a big deal, but a downstream impact of the turn going away is that the initial ROR is going to change drastically.

I know that when I asked for feedback on an initial batch I put through the Bullet, some of the first suggestions I got involved shooting for a higher initial ROR. If a lot of people currently use that figure as a critical signpost in their batches, pulling it out from under them without time to get used to the idea isn’t going to be well received. This way, folks who have built habits around the behavior of more traditional temperature probes have time and space to correlate what they’ve always done with what’s different about IBTS before losing what might be an important early-roast signpost.

As for me, it wouldn’t bother me: this is the first gear I’ve owned that really lets me easily track and consider curve data, and I’m not too used to it yet. I can take that “your initial ROR should be 5 or 6 degrees higher” suggestion, abstract it to “bump up my power setting at the beginning,” and be fine…

Yesterday, I ran across this information in the Bullet manual (top of page 17) that I had missed.

“Bullets with the IBTS (V1.5 & V2.0) will by default show the IBTS temperature as the
Bean Temp. The X-LED light above the A button will be ON when the temperature is
from the IBTS and OFF when displaying the bean probe temperature. By pressing the A
button you can toggle between the bean probe and the IBTS.”

That appears to say that if you have at least v1.5 you can select between the IBTS and the bean probe reading for bean temp. I have not played with switching between the two sources, but I am wondering: does the ROR graph adjust depending upon which bean temp source the user selects from the Bullet control panel?

I’m a fan of seeing “the turn” just as the indicator of when the beans are beginning to take on heat after charging.

I’ve played with the “A” button switching back and forth and I don’t recall seeing any change in the RoR graph. The reading just toggled back and forth to match either the bean temp or drum temp reading in RT. I just leave it on the IBTS reading now and pay little attention to the bean temp from the old probe.

This is for the roaster display and doesn’t affect anything in RoasTime. ROR is still based on the old Bean Temp probe and not the IR sensor.

Ok, thanks for the replies. I’ll hafta play with it myself (just for grins) when I wrap up the last 2 “seasoning” roasts that I’ll be doing this weekend.

I suspect a lot of folks are, but an ROR that pulls data from a source that doesn’t have the turn is going to be a big change for folks who use it.

There’s also the potential for scaling issues, since the possible range of ROR values is going to be a lot wider. For grins, I went back to a curve I manually built when I was using my Gene, based on periodically noting the temperature on its display. I intentionally picked one where I didn’t preheat, since that temperature curve looks the most like a curve we’d get from the Bullet’s IBTS (I used Fahrenheit on the Gene):

[screenshot: manually logged temperature curve from the Gene roast]

Here’s an ROR plot for that roast. This is pretty well in line with what we could expect from an IBTS-based one:

[screenshot: ROR plot for the same Gene roast]

The good news is that the early direction change could give folks who make use of “the turn” a reasonable proxy. The bad news is that, besides a different peak value/range users would need to internalize, the more extreme early changes would make the ROR changes later in the roast, where they’re arguably more important, seem to plateau when they really don’t. I’m sure there are ways to manipulate the scale or the display to minimize that, but good luck building consensus on how that should look. 🙂
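
To make the scale problem concrete, here’s a toy illustration with made-up numbers (matplotlib, nothing from RT2): the same IBTS-style ROR series plotted at full scale versus with the y-axis clamped to the late-roast range.

```python
import matplotlib.pyplot as plt

# Made-up IBTS-style ROR series: big early values, subtle late-roast changes.
ror = [45, 38, 30, 24, 19, 15, 12, 10, 9, 8.5, 8, 7.8, 7.5, 7.6, 8.2]
time_min = [i * 0.75 for i in range(len(ror))]

fig, (full, zoomed) = plt.subplots(1, 2, figsize=(9, 3), sharex=True)
full.plot(time_min, ror)
full.set_title("Full scale: late roast looks flat")

zoomed.plot(time_min, ror)
zoomed.set_ylim(5, 15)  # clamp the y-axis to the late-roast range
zoomed.set_title("Clamped: late-roast detail visible")

for ax in (full, zoomed):
    ax.set_xlabel("minutes")
    ax.set_ylabel("ROR")
plt.tight_layout()
plt.show()
```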