IoT Demo 2k18: Making of the Bin Unit PCB

Hi there! It’s time for another update: the Bin Unit board has made the journey from theory to practice!

So here is a small summary of milling the PCB, soldering the parts onto it, writing the firmware and finally testing it!


First of all, the board had to be milled out as intended in the Bin Unit Board Design:


But after having cut out some boards (one machine needs 4 of them, and I am building two identical machines plus some extra units for experimenting / spare parts), it turned out that some were of acceptable quality and some were not:

Acceptable quality
Unacceptable quality

The not-so-good versions had frayed / fuzzy contours, which looked ugly and needed substantial rework after milling in order to become usable: the small frayed copper flakes had to be scratched out with a needle so that they would not become the cause of accidental short circuits between the circuit paths. This rework was not necessary for the better ones. Some searching on the web turned up a hint that this effect can be observed with dull engraver pins (but mine were virtually new!) or with pins that are not mounted absolutely concentrically. The latter turned out to be the reason in effect here. As another lesson learned, pins with a 0.2 mm tip diameter produced more rugged / reliable isolation paths than the 0.1 mm versions. Here is the final result:

So the boards were there now; the next step was to solder the parts onto them (an uneventful task for these boards, at least when having a magnifying glass at hand or an age of << 50). So the first prototype board (intended for testing) looked like this:



The final manufacturing step was to write and upload the firmware for the microcontroller (an ATtiny2313a): without the firmware, the board is just a dumb and useless piece of hardware…
As people in the software business know, the real trouble / complexity most often lies in the software. So it does here. These microcontrollers can be programmed either in assembler (if you have time and the project is not too big; I started with that approach once upon a time) or – much better – in plain C, which does not mean losing any kind of control, performance or flexibility on these controllers:

The Tool-Chain

There is the excellent open-source WinAVR environment, bundling the GNU C(++) compiler, auxiliary tools and avrdude for programming / uploading to the chips, potentially combined with Notepad++ for editing / build automation.
If you want a bit more IDE feeling, use Atmel Studio 4 (scroll a bit down in the archive to 4.19; that would also be the first choice for assembler, as it has a built-in simulator): even though retired, it still strikes a good balance between features and footprint. For those (Ma[c|d]-attitude) people who basically like a blinky IDE with all the fancy frills and sidebars around it (and who eventually might also do the intended work if time still allows), Atmel Studio 7 would be the state of the art. If it sounds and looks like MS Visual Studio, the reason is that it is based upon it. And being state of the art, it has the bloatware attitude of many contemporary software packages these days. Anyway, you can set up a lightweight tool-chain if you want.

The Firmware

…basically has to drive the BYJ-48 stepper motor and shall listen on an I2C bus in order to receive its instructions and give feedback. It also has to keep track of the various (optical) bin-level / part-counter and motor-position sensors and perform the necessary reactions to them. As the ATtiny-family controllers do not have a fully-fledged I2C interface, but only a more generic serial USI interface, the full I2C protocol handling has to be built with some bits of software instead. As 99% of mankind (including me) could not scratch-build that with just the datasheet, imagination and logical thinking at hand, there exists the appnote AVR312 that basically shows how to do it: it is the “mother” of all concrete implementations available on the web.
I decided to use this implementation. But (unfortunately for my workload), I’m quite often not happy with the things I get out of the box: so it was here, too. Whereas the tricky I2C groundwork (which I was after) was there in a convincing way, the way it is exposed unveiled two flaws for me:

  • There was no provision for atomic transfers of whole byte sequences. That is a problem even for cross-interrupt-routine data transfers (not only for interrupt-routine-to-main-program transfers), and it cannot be worked around by atomic access to these byte sequences either, as I2C reading / writing definitely happens byte-wise in the respective interrupt routines. So a kind of atomic handover mechanism had to be added.
  • The implementation just silently reads and writes the I2C payload data to variables. While ideally non-intrusive for the worker code, it has no notion of computing things on the fly on retrieval or of triggering immediate actions / events on reception of data. Thus, a kind of event-driven handover had to be added.

After some thinking, the existing implementation could easily be modified / enhanced with these traits. So this I2C handling is THE critical factor in this firmware, and it is covered thereby.
On the other hand, writing code to drive the stepper motor is pretty straightforward once you know how such motors work. Doing this simple processing in a tailored manner also gives a good insight for more general “make-or-buy” considerations: whereas for I2C, “buy” (AND customize, of course!) was the solution, “make” clearly is better here (as can be seen on the web, where clueless Arduino kiddies struggle with adapting not-quite-matching Arduino libraries to somehow make it work – trouble and outcome speak a clear word about sometimes better using your own mind).
The firmware also has a small built-in test mode that allows basic testing of the sensors and the motor. It is selected by special I2C-address jumper settings.

In the whole course of development, it was not absolutely clear from the beginning whether the 2 KB of flash memory of the ATtiny2313a could hold the whole code; I approached the limit dangerously closely. As an alternative, the compatible “big brother” (the ATtiny4313) could have been used (but as I programmed fast, I was faster than the memory could fill up…). So at the end of the day, 98% of the flash memory was occupied, basically by carefully deciding what had to be in and what not.

The Rollout

Now the time had come when everything should work. In theory. Thus the testing in turn (and be aware that a visual debugger is not available in this environment; just as println() was your friend in the pre-GUI days, so is the auxiliary LED on the board here):

The first task here is to bring the firmware onto the controller, using a (cheap no-name) USBasp programmer: the board exposes the relevant controller lines for an ISP programming approach.

Board attached to the USBasp programmer

The first test round then was to run the standalone built-in test cycle in order to check the sensor handling (virtually a no-brainer anyway) and the motor control:


  • There was a first bug in the jumper evaluation for the test mode.
  • After fixing it, it turned out that there were still two bugs in the motor control (which were easy to fix).
  • Also, (sensor-)switching the motor modes unveiled that even these optical sensors are prone to bouncing effects, typically known only from mechanical switches. An issue potentially still to follow up on for part counting…

Then the 2nd test round followed, this time concentrating on the I2C communication interface (as mentioned, THE critical part of this firmware). In fact, the board implements an I2C slave that wants to be controlled by an I2C master. In the final machine, the Head-Unit will become the I2C master, but the Head-Unit did not yet exist!
So there was the need for a quick (and versatile) I2C master. For that purpose, the Raspberry Pi helped in providing a temporary test client by means of the Linux i2c-tools package. So the whole hardware was assembled in order to drive the board via these tools:


Here is the I2C command set implemented by the board:


Now the test was performed, trying to access the board from the command prompt, and what happened was … nothing at all!
Thus, the worst possible debug case had occurred: hard-to-understand code, and not only the software as a potential source of the problem:

  • The first thought was that the I2C clock-stretching effect, and the well-known flaw of the Raspi concerning this trait, could be the reason (though with single-byte accesses and a high board speed, that should not be the case). Anyway, bringing down the bus speed from the default 100 kHz to 10 kHz showed no effect.
  • Next, the critical I2C interfacing of the board (running at 5 volts) to the Raspi (running at 3.3 volts) was considered as the reason: in this special master-slave situation, and knowing how the bus works, these different voltages at least pose no risk of damaging the Pi. But potentially incompatible logic levels being out of tolerance could still make things inoperable. So the mitigation was to operate the board on 3.3 volts as well, in order to rule out this error option. The only problem was that the controller, using the external 16 MHz clock, was out of spec for reliable operation at that low voltage (~4 V is the threshold at that clock speed). So it was necessary to temporarily reprogram it to use the internal 8 MHz clock only, thereby making it usable at the lower voltage. For that purpose, the internal speed step-down of the firmware had to be adjusted as well, so that the effective timings remained the same. But this test also brought no improvement…
  • Next, there was the doubt that the I2C bus of my Raspi could have been damaged (as the mentioned cross-voltage I2C coupling could pose a risk for the Pi, depending on how the I2C slave code behaves – inspecting the code did not yield theoretical bullet-proofness in this respect!). So it was time to attach a “real” I2C slave in order to verify proper operation of the bus. A small I2C A/D converter, the ADS1115 (as used in the recent Fridge Demo), was still lying around, so I used it for testing:

    It turned out that everything worked using this slave!
    So the overall conclusion was that the Raspi + i2c-tools were working, and the cross-voltage coupling did not cause the error either. A hardware error on the board / controller could also be excluded, as those two I2C pins are also used for ISP programming and were always operative. So what remained as the reason was the firmware code…

  • A new day, a fresh cup of coffee … and suddenly the feeling of being the dumbest creature in the universe:
    I had simply made a wrong association between the physical I2C address jumper setting on the board and the logical pin levels when crafting the device address. Even worse, some of the selectable addresses were also outside the permitted I2C address range (guess which one was set on the board…). Thus, even the i2cdetect tool did not list the board at all, not even at an unexpected address!
    So fixing this address handling made the device accessible, even after reverting it to 5 V mode (but still keeping the bus speed down to, say, 10 kHz: for those few bytes of data that is no issue, and 10000 baud is plenty of speed, even beyond the non-entry-level modems of the early days at 9600 bps).
  • Now the I2C communication worked reliably, except for writing multiple bytes: a logical error in my enhancement of the I2C slave code was the reason for that.
  • When testing the whole functionality then, the initially observed bouncing of the optical sensors turned out to be a problem for part counting. But in the microcontroller scene, bouncing is a well-known effect; some people try to solve it brute-force by adding hardware parts, whereas the typical approach is to tackle it at the software level. So modifying the firmware to cope with bouncing resolved this issue, at least to the level of part-counting testing done so far*.
    * This bouncing of optical sensors is also an important insight / achievement for potential future B1i-based projects / customer cases that deal with such sensors, e.g. bottle counting and so on…



Being at that stage, the Bin Unit controller circuit is finalized and operative. It works well with a standard third-party reference I2C master (the Raspi + i2c-tools). Thus, it should also be able to work well with the future real “head” still to come (whereas it runs “headless” these days … but don’t even we run headless more often than we’re aware of?).

The first part-counting tests also unveiled that small (metallic) parts such as small screws, nuts and washers cannot be counted reliably with the intended neat optical IR reflective sensors. What works much better are small colored (wooden / plastic) balls / pearls (approx. 1 cm in diameter), as long as their color is not too dark / dull or extremely glossy. Such balls will also be ideal candidates for achieving a rugged and reliable processing mechanism in the machine.
In practice, it can be a model / showcase for a production process in pharmaceutical packaging (as being one exemplary placeholder for Industry 4.0).

So next station is the building of the Head-Unit and at the same time slowly slewing towards the mechanical (3-D printed) parts of the machine…


