Pages

Showing posts with label IoT. Show all posts

Sunday, July 14, 2019

Robo4J: Adafruit LED Backpacks - guide

Img.1.: Robo4J with Adafruit LED Backpacks
  In this blog post I will describe how to use the Adafruit LED Backpacks together with the Robo4J framework. All of the examples run on Java 11+ and, as you will see, they are very easy to implement, modify and connect to even more advanced systems.

Inspired by a motto: Plug&Play.

It's a lot of fun to have all those LEDs under control, and thanks to Robo4J you don't have to port a C++ example, just use pure Java ...

  Adafruit is a great company that provides many easy-to-use breakout boards for the Raspberry Pi platform. One family of them is the LED Backpacks.
  Robo4J contains implementations for the Bargraph, Matrix and Alphanumeric ones. The agenda:
  1. Bi-Color 24 Bargraph
  2. Bi-Color 8x8 Matrix
  3. 0.54" Alphanumeric
To assemble any of the backpacks, I recommend going through the well-written manuals on the official Adafruit web pages.

Example requirements:
1. Raspberry Pi 3 B+
2. Java 11 (example: BellSoft Liberica 11+)
3. one of the Adafruit LED Backpacks

  Before we enjoy the LED shows, let's take a quick look at how everything is bundled. It should give you an idea of how to connect LEDs in cases other than the ones discussed here.
  As you may know, Robo4J is a lightweight messaging framework. It is separated into dedicated modules according to their purpose. In this blog post we use robo4j-core, robo4j-units-rpi and robo4j-hw-rpi, as you can see in the diagram below. All three I2C examples are by default configured as follows:


int bus = I2CBus.BUS_1;
int address = 0x70;

Important: if you intend to use only robo4j-hw-rpi, i.e. only the hardware, the module can be used alone without any additional dependencies.
Img.2.: Robo4J Adafruit backpack schema

Let's suppose everything is assembled and start with the first example, the Bi-Color 24 Bargraph.

1. Bi-Color 24 Bargraph
Img.3.: Bi-Color24 Bargraph example
We have connected the device over the I2C interface to the Raspberry Pi according to the Adafruit documentation. As the first step we test the hardware, so let's create a simple example that uses only the robo4j-hw-rpi module:


public class BiColor24BargraphExample {
    public static void main(String[] args) throws Exception {
        BiColor24BarDevice device = new BiColor24BarDevice();
        device.clear();
        device.display();

        int counter = 0;
        while (counter < 3) {
            for (int i = 0; i < 12; i++) {
                int colorNumber = (i + counter) % 3 + 1;
                XYElement element = new XYElement(i, BiColor.getByValue(colorNumber));
                device.addBar(element);
                TimeUnit.MILLISECONDS.sleep(200);
                device.display();
            }
            counter++;
        }
    }
}
The code above shows how the elements are created and sent to the Robo4J hardware.
The next snippet shows how the hardware is exposed through units from the robo4j-units-rpi module. This module offers more advanced features on top of the hardware, such as commands or continual operations, which allow more advanced usage.


RoboContext ctx = new RoboBuilder().add(settings).build();
ctx.start();
RoboReference<LEDBackpackMessage> barUnit = ctx.getReference("bargraph");
LEDBackpackMessage<XYElement> clearMessage = new LEDBackpackMessage<>();
AtomicInteger position = new AtomicInteger();
executor.scheduleAtFixedRate(() -> {
    if (position.get() > BiColor24BarDevice.MAX_BARS - 1) {
        position.set(0);
    }
    barUnit.sendMessage(clearMessage);
    XYElement element = new XYElement(
        position.getAndIncrement(),
        BiColor.getByValue(position.get() % 3 + 1));
    LEDBackpackMessage<XYElement> addMessage = new LEDBackpackMessage<>(LEDBackpackMessageType.DISPLAY);
    addMessage.addElement(element);
    barUnit.sendMessage(addMessage);
}, 2, 1, TimeUnit.SECONDS);

Congratulations, the first LED Backpack is in use and running!

2. Bi-Color 8x8 Matrix
Img.4.: Bi-Color 8x8 Matrix Face example
The second example is the Bi-Color 8x8 LED Matrix. It is really a lot of fun, as you can display simple images or animations. The matrix is provided in exactly the same manner as in the previous example. The hardware can be used through the robo4j-hw-rpi module alone to make hardware testing simpler, quicker and more fun ;). Here is a simple code snippet:


char[] faceSmile = ("00333300,03000030,30300303,30000003,"
    + "30300303,30033003,03000030,00333300").toCharArray();
...
List<char[]> availableFaces = Arrays.asList(faceSad, faceNeutral, faceSmile);
for (char[] face : availableFaces) {
    matrix.clear();
    matrix.display();
    byte[] faceByte = LEDBackpackUtils.createMatrixBiColorArrayByCharSequence(
        matrix.getMatrixSize(), ',', face);
    XYElement[] faceElements = LEDBackpackUtils.createMatrixByBiColorByteArray(
        matrix.getMatrixSize(), faceByte);
    matrix.addPixels(faceElements);
    matrix.display();
    TimeUnit.SECONDS.sleep(1);
}
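The face strings above pack one 8x8 image into eight comma-separated rows, where each character selects the color of one LED ('0' = off, '1'..'3' = a BiColor value). Below is a minimal standalone sketch of that decoding; the `Pixel` class and `decode` helper are illustrative stand-ins, not Robo4J's actual `XYElement`/`LEDBackpackUtils` API:

```java
import java.util.ArrayList;
import java.util.List;

public class MatrixFaceDecoder {

    // Illustrative stand-in for Robo4J's XYElement: x, y and a bi-color value.
    static final class Pixel {
        final int x, y, color;
        Pixel(int x, int y, int color) { this.x = x; this.y = y; this.color = color; }
    }

    // Decode a comma-separated face string into the list of lit pixels.
    static List<Pixel> decode(String face) {
        List<Pixel> pixels = new ArrayList<>();
        String[] rows = face.split(",");
        for (int y = 0; y < rows.length; y++) {
            for (int x = 0; x < rows[y].length(); x++) {
                int color = rows[y].charAt(x) - '0'; // '0' = off, 1..3 = color value
                if (color > 0) {
                    pixels.add(new Pixel(x, y, color));
                }
            }
        }
        return pixels;
    }

    public static void main(String[] args) {
        String faceSmile = "00333300,03000030,30300303,30000003,"
                + "30300303,30033003,03000030,00333300";
        System.out.println("lit pixels: " + decode(faceSmile).size()); // 26 lit LEDs
    }
}
```

This makes it easy to sketch new faces in a text editor first and only then send them to the matrix.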
  
  Having tested the hardware, we can move to the robo4j-units-rpi module and enjoy more advanced hardware usage. Now we have the opportunity to connect units with one another and create a more advanced system based on message passing; see the example snippet below:


RoboContext ctx = new RoboBuilder().add(settings).build();
ctx.start();
RoboReference<LEDBackpackMessage> matrixUnit = ctx.getReference("matrix");
LEDBackpackMessage<XYElement> clearMessage = new LEDBackpackMessage<>();
AtomicInteger position = new AtomicInteger();
executor.scheduleAtFixedRate(() -> {
    if (position.get() > 7) {
        position.set(0);
    }
    matrixUnit.sendMessage(clearMessage);
    XYElement element = new XYElement(
        position.get(), position.getAndIncrement(),
        BiColor.getByValue(position.get() % 3 + 1));
    LEDBackpackMessage<XYElement> addMessage =
        new LEDBackpackMessage<>(LEDBackpackMessageType.DISPLAY);
    addMessage.addElement(element);
    matrixUnit.sendMessage(addMessage);
}, 2, 1, TimeUnit.SECONDS);


3. 0.54" Alphanumeric
Img.5.: Adafruit Alphanumeric LED Backpack
 The previous examples were only about turning specific LEDs on and off. The last example shows how to display ASCII characters on a 14-segment display (Img.6., see wiki). A value can be transmitted as an ASCII character, or specific segments can be turned on/off individually, as visible in the picture above (Img.5.).
Img.6.: 14-segments LED (source: Adafruit)
The robo4j-hw-rpi module provides an example that shows how to configure and run Adafruit hardware:

AlphanumericDevice device = new AlphanumericDevice();
device.clear();
device.display();
device.addCharacter('R', false);
device.addCharacter('O', true);
device.addCharacter('B', false);
device.addByteValue((short) 0x3FFF, true);
device.display();
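The `addByteValue((short) 0x3FFF, true)` call writes a raw segment bitmask: each of the 14 segments corresponds to one bit, so 0x3FFF (fourteen set bits) lights the whole digit. A small sketch of composing such masks by ORing segment bits; the `maskOf` helper is illustrative, and the actual bit-to-segment assignment must be taken from Adafruit's documentation:

```java
public class SegmentMask {

    // Hypothetical segment indices 0..13; the real bit order is in the datasheet.
    static short maskOf(int... segments) {
        short mask = 0;
        for (int s : segments) {
            mask |= (short) (1 << s); // one bit per segment
        }
        return mask;
    }

    // All fourteen segment bits set, as used in the example above.
    static final short ALL_ON = 0x3FFF;

    public static void main(String[] args) {
        short all = maskOf(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13);
        System.out.printf("0x%04X%n", all); // prints 0x3FFF
    }
}
```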

It looks very simple. It's possible to add a value by its character reference or by its 16-bit segment value. The robo4j-units-rpi module, as in the previous cases, provides advanced features that can be used to control the Alphanumeric LED Backpack; see the code below:

RoboContext ctx = new RoboBuilder().add(settings).build();
ctx.start();
RoboReference<LEDBackpackMessage> alphaUnit = ctx.getReference("alphanumeric");
LEDBackpackMessage<AsciElement> clearMessage = new LEDBackpackMessage<>();
LEDBackpackMessage<AsciElement> displayMessage = new LEDBackpackMessage<>(LEDBackpackMessageType.DISPLAY);
AtomicInteger textPosition = new AtomicInteger();
executor.scheduleAtFixedRate(() -> {
    if(textPosition.getAndIncrement() >= MESSAGE.length - 1){
        textPosition.set(0);
    }
    alphaUnit.sendMessage(clearMessage);
    LEDBackpackMessage<AsciElement> messageAdd = new LEDBackpackMessage<>(LEDBackpackMessageType.ADD);
    char currentChar =  MESSAGE[textPosition.get()];
    adjustBuffer(currentChar);
    messageAdd.addElement(new AsciElement(0, BUFFER[0], false));
    messageAdd.addElement(new AsciElement(1, BUFFER[1], false));
    messageAdd.addElement(new AsciElement(2, BUFFER[2], false));
    messageAdd.addElement(new AsciElement(3, BUFFER[3], false));
    alphaUnit.sendMessage(messageAdd);
    alphaUnit.sendMessage(displayMessage);
}, 1, 500, TimeUnit.MILLISECONDS);
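The `adjustBuffer` helper and the `MESSAGE`/`BUFFER` fields are not shown in the snippet above. One plausible implementation, sketched here as an assumption rather than Robo4J's actual code, shifts the four visible characters one position left and appends the incoming character, producing a scrolling-text effect:

```java
public class ScrollBuffer {

    // Four visible positions on the 0.54" alphanumeric backpack.
    static final char[] BUFFER = {' ', ' ', ' ', ' '};

    // Shift the buffer one position left and append the incoming character.
    static void adjustBuffer(char next) {
        System.arraycopy(BUFFER, 1, BUFFER, 0, BUFFER.length - 1);
        BUFFER[BUFFER.length - 1] = next;
    }

    public static void main(String[] args) {
        for (char c : "ROBO4J".toCharArray()) {
            adjustBuffer(c);
            System.out.println(new String(BUFFER)); // last printed line: "BO4J"
        }
    }
}
```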

Final words and conclusion
 The Robo4j framework delivers easy-to-use hardware abstractions. These abstractions are exposed through Robo4j units. The units provide additional functionality (for example simple matrix operations, bargraph LED selection, etc.) and allow scheduled or time-based operations. All units can be extended with any features the Java ecosystem provides.

The Robo4j Adafruit LED Backpack implementation gives you the power to create any LED show.
Enjoy and happy coding!

Monday, October 24, 2016

Robo4j: an introduction to reactive "real-time" micro-services based IoT system development

  
Robo4j testing system: camera unit, ultrasonic, gyro, color and touch sensors. The whole system also contains one Raspberry Pi, a Lego Brick and a CuBox i4Pro (4-core CPU) with a 500GB hard drive as reusable data storage. The system is powered by two 2500mAh batteries and one 25000mAh unit.
  This article discusses the high-level perspective of the upcoming robo4j.io framework and Internet of Things (IoT) system design, so let's move forward.
  Over the last few years, micro-services have been one of the most emerging topics (and still are), followed by the reactive manifesto, which explains how to design non-blocking, highly responsive, resilient, elastic and message-driven parts of an application. From my perspective, most of those articles have focused on web applications of middle and large size, describing best practices for splitting up monolithic systems or employing new trends in micro-services and non-blocking design patterns.
  Only a few of them have touched on the other emerging technologies around the Internet of Things (IoT) in the way I've been searching for.
  Over the years of reading many of them, I got a strong feeling that the way we intend to develop IoT systems is to take already developed, robust technologies and make them smaller. These days the market offers good examples of such successes: we use Spring Boot, Netty, Vert.x and others. Those frameworks basically allow us to connect the developed system to different types of databases, and to analyse and react to events or data obtained from those data stores.
The question is still there: is this really enough?
Is this kind of abstraction good enough to allow us to develop robust IoT systems?
  IoT systems that can react to incoming events, sort them properly and, in special cases, forward them to the proper system parts that can process and execute them?
  I don't think so, or at least I haven't found anything like this on the market.

  We can go further in questioning ourselves about IoT possibilities, which can quite quickly turn into a discussion about Artificial Intelligence. In my opinion the AI question is not relevant for this article, because even most of the currently available IoT systems still can't exhibit intelligence in the sense of independent decisions. That question is reserved for another IoT-focused article.
  Let's go back to the main topic: advanced Internet of Things system development and design. Before we start, a small review of the definition taken from Wikipedia:
   "The Internet of Things (IoT) is the internetworking of physical devices, vehicles, buildings and other items—embedded with electronics, software, sensors, actuators, and network connectivity that enable these objects to collect and exchange data". In other words, IoT consists of objects that are connected to the internet or other networks and that are able to receive and process data.

   The definition looks pretty good so far, until you try to use some of those connected "smart" objects together. The IoT system is now a collection of independent IoT units, and suddenly nothing works as expected due to connection instability, communication-protocol message ordering and so on.
  Although such a system has been assembled from cutting-edge parts, altogether the parts simply don't work, due to the previously mentioned issues or other ones we are not even aware of. The whole development may turn into a really FUZZY mode.
I normally call this style of development try, push, deploy development (TPDD), but it may not fully satisfy the expectations or needs; it simply doesn't work from a long-term perspective. PS: there is still a chance that I may be wrong.
The described situation can become really frustrating and stressful, especially when there are not many alternatives on the market. You may use ROS (the Robot Operating System), but ... let's go back.

  The lack of proper tools on the market was the main motivation to develop the robo4j.io framework, which satisfies the following needs:
  1. robo4j.io is capable of synchronising incoming asynchronous events/messages under dynamically changing conditions in "real time".
  2. robo4j.io should allow and enforce rapid micro-services based system development.
  3. robo4j.io should be a light framework capable of running, at full performance, on any hardware supported by the JVM, i.e. on any Java-powered device.
  4. robo4j.io should be independent and not hardware specific; it must be flexible to extend.
  5. robo4j.io should have control over all units running inside the robo4j.io ecosystem.
  6. robo4j.io should enable communication with external systems.
  7. robo4j.io should be easy to use.
   All seven points are really challenging; none of them is as trivial as it may seem. I've taken this challenge and developed a tool that satisfies all of them: the mentioned robo4j.io framework.
  The robo4j.io framework allows you to connect and control any device that is accessible over a specific type of network (Ethernet, WiFi, Bluetooth etc.) or wire. Moreover, robo4j.io supports any technology running on the JVM with its natural languages (Scala, Groovy etc.).
   
  Now it's time to look at the basic framework architecture from a high-level perspective, which is the main intent of this article. The image below shows the high-level interaction between the different parts inside the robo4j ecosystem. It's important to see the differences between the inputs to the system.
Robo4j framework high-perspective architecture
  Such input can be provided by Sensors, Motors (Engines) or RoboSockets. Incoming messages are then processed by the system itself and serialised so that the result can be executed on the connected hardware.
Inside the message-processing stage the system may use different types of data storage available to a specific RoboUnit. This means that each RoboUnit may or may not have its own specific data resource, or RoboUnits may share them with one another in a reactive, non-blocking manner.
  The Robo4j Message Bus does not depend on the order in which any specific message is processed; the bus consumes messages in the order of their arrival and may dynamically change conditions in real time. The pre-results are then moved into the MessageReactor parts. Those parts again have connections to the data storage, where their results can be persisted. The final result can be executed on the real hardware or can affect the whole system state inside a predefined cloud.
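The arrival-order consumption described above can be sketched with a FIFO queue and a single consumer. The class and method names below are illustrative assumptions, not Robo4J's actual API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class InOrderBus {

    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();
    private final List<String> processed = new ArrayList<>();

    // Producers may call this from any thread; the queue preserves arrival order.
    public void send(String message) {
        queue.offer(message);
    }

    // A single consumer drains messages strictly in the order they arrived.
    public void drain() {
        String msg;
        while ((msg = queue.poll()) != null) {
            processed.add(msg); // stand-in for forwarding to a MessageReactor
        }
    }

    public List<String> processed() {
        return processed;
    }

    public static void main(String[] args) {
        InOrderBus bus = new InOrderBus();
        bus.send("sensor:distance=42");
        bus.send("motor:stop");
        bus.drain();
        System.out.println(bus.processed());
    }
}
```

A single consumer over a FIFO queue is the simplest way to guarantee arrival-order processing while still accepting messages from many producers concurrently.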
  Finally, we are at the end of this motivational article about the robo4j.io framework. The whole text should motivate you to use it, because the Internet of Things is the future of IT.
The future of IoT includes communication security and different kinds of message processing and reactions to them. The Robo4j framework is meant to be part of that future.
Stay tuned.
Robo4j IoT system (robot) is discovering a park



   

Saturday, June 4, 2016

2016.06.04 :: Robo4j.io alpha release is out (the Internet of Things is powerful)

   Finally, after a few months of development, Robo4j.io has been published today, 2016.06.04, in the alpha-0.1 version. I've prepared a light version of the framework as the kick-off for upcoming new features that I've either already developed or that are going to be added.
  I've mentioned in one of the previous posts what kind of functionality will be included:

  • robo4j-core
    • core functionality that provides communication with the robot, synchronisation of asynchronous events, etc.
  • robo4j-line
    • command-line interface capable of dynamically extending robot behaviour
   It has also been announced that the first alpha release will focus on Lego Mindstorms EV3.
  Currently I'm working on documentation so that anyone who has followed the robot build instructions will be able to run Robo4j.io without any problems.

Enjoy and stay tuned! 

Saturday, May 28, 2016

Robo4j.io :: "real-time" - synchronisation of asynchronous events/tasks (coming)

   As far back as I can go in my memories, my interests have always touched the robotics or artificial intelligence branches. I've spent an uncountable number of hours investigating and studying the possibilities, developing "simple" systems, designing applications and reviewing books (focused on data mining, machine learning or application development itself).
I'm pretty sure it helped me figure out what Machine-to-Machine (M2M) communication may look like.
  The machine seems to be well defined, but what actually is "a machine"? Good question, isn't it? We can define it as a system that performs some action. May a "machine" be a sensor base that produces a stream of signals? I think yes, so let's keep it as a variable. On the other hand, the term "end-user" is a little more specific.
Now we have two terms Machine and End-User. 
Machine and End-User example. What is Machine, End-User and System?
   Machines can communicate with other machines. Some of them have access to end-users. Both of them (machine and end-user) are able to asynchronously produce a mass of events/signals that need to be synchronised and processed at some point in time. Moreover, both of them are "systems".

   One night I was working on a concurrent, parallel application and suddenly a nice idea came to my mind: robo4j.io. For a long time I've wanted to create a framework that allows me to synchronise asynchronous events/signals and process them, and here it is:

   I've created robo4j.io framework

Robo4j.io makes my dream real by connecting all devices (CuBox, Raspberry Pi, Lego etc.) to different systems.
   The important connection point between all of those systems is Java, or rather the Java Virtual Machine, as I'm not using only Java.

   Let's see how robo4j.io will perform :)