Monday, August 18, 2014

Ixonos Industrial Internet Suite Goes Cloud

Over the past few months our R&D team has been hard at work creating a complete "data pipe" from sensors to cloud. By combining some of our existing components (such as Wireless Sensor Data Collection with BTLE, Interactive Embedded Touch GUIs with HTML5, and the Ixonos sensact library) with newly created ones, we now have a complete solution in place: the Ixonos Industrial Internet Suite.

The basic architecture is illustrated in the picture below. We use our sensact libraries running on Linux to collect the data, a secure WebSocket connection to create local Human Machine Interface views on mobile devices, and ship the data onward to the Ixonos Elastic Cloud.
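On the HMI side, the browser simply subscribes to the sensor stream. As a rough sketch of what that looks like (the message format, field names and endpoint URL below are hypothetical illustrations, not the actual Suite protocol):

```javascript
// Hypothetical JSON format for one sensor sample arriving over the
// secure WebSocket (NOT the actual Suite wire format):
//   { "device": "ble_sensortag0", "sensor": "temperature", "value": 23.4 }
function handleSensorMessage(raw) {
    const msg = JSON.parse(raw);
    if (typeof msg.value !== "number") {
        throw new Error("malformed sensor sample");
    }
    return { device: msg.device, sensor: msg.sensor, value: msg.value };
}

// In the browser-based HMI, samples would be consumed like this:
if (typeof window !== "undefined" && "WebSocket" in window) {
    const ws = new WebSocket("wss://gateway.example/sensors"); // hypothetical endpoint
    ws.onmessage = (event) => {
        const sample = handleSensorMessage(event.data);
        // ...update the corresponding HMI widget with sample.value...
        console.log(sample.sensor, sample.value);
    };
}
```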


Our UI framework choice for everything is HTML5. This makes our solutions run on different platforms with minimal customisation. Of course, we do some tweaking to make the HTML5 apps run smoothly and stably on hardware with limited resources.

Meanwhile, our colleagues at Ixonos Design Studios have been working their magic, creating a complete facelift for the UX of the different solutions. With these guys, even industrial automation can be fun and easy to use.

Take a look at the video below for a visualisation of the system:



Interested? Check out more information and contacts at: http://www.ixonos.com/business-areas/industrial-internet

Jukka Hornborg, Head of Offering Management, Ixonos Plc

Thursday, March 6, 2014

Wireless Sensor Data Collection with BTLE

The Bluetooth Low Energy protocol, also known as Bluetooth 4.0 or Bluetooth Smart, is a hot topic right now. At Ixonos we have been working with it for a while; one example is the addition of Texas Instruments SensorTag support to our libsensact library.

The TI SensorTag is a small Bluetooth Low Energy device with six different sensors; it runs on a coin-cell battery with very low current consumption.

Applications that use libsensact can configure the TI SensorTag as one of the devices to connect to and read sensor data from.

The code for connecting to a TI SensorTag is similar to the code needed for the USB sensors in the earlier example. The difference is that you define the BTLE addresses of the devices instead of USB IDs:


/* List of supported sensor/actuator devices */

struct ble_sensortag_config_t ble_sensortag0_config =
{
    .ble_address = "BC:6A:29:C3:3C:79",
};

struct ble_sensortag_config_t ble_sensortag1_config =
{
    .ble_address = "BC:6A:29:AB:41:36",
};

struct sa_device_t devices[] =
{
    {   .name = "ble_sensortag0",
        .description = "TI sensortag 0",
        .backend = "ble_sensortag",
        .config = &ble_sensortag0_config },

    {   .name = "ble_sensortag1",
        .description = "TI sensortag 1",
        .backend = "ble_sensortag",
        .config = &ble_sensortag1_config },

    { } /* list terminator */
};

int main(void)
{
    int sensortag0;
    /* … */
    sensortag0 = sa_connect("ble_sensortag0");
    /* … */
}
 
The video below shows the code and the TI SensorTag in action on our HTML5 demonstrator prototype.



Now that basic BTLE support is in place, we have an easy way of bringing wireless sensors into our sensor framework. Adding support for new BTLE sensors is straightforward with our scalable architecture.

Stay tuned for further updates as we combine this and other components into the Ixonos Human Machine Interface solution, to be launched in the near future!

Tero Koskinen, Senior SW Designer - Ixonos
Petteri Tikander, Senior SW Designer - Ixonos
 

Friday, January 31, 2014

Ixonos Multi-Display for Android 4.4.2 with Miracast

The Ixonos Multi-Display solution has, since the previous post, been ported to Android 4.4.2, and a few features have been added in the process. The solution is truly medium agnostic and works over any medium supported by the platform's DisplayManagerService (MHL, HDMI, Miracast, etc.). Finally, applications can now be moved between displays through the 'recents' menu.

The solution addresses the limitations of the Android platform when it comes to multitasking and running several apps in parallel on different displays.

The video below shows a Nexus 10 tablet running Android 4.4.2, initially connected to a TV via HDMI and playing an action flying game. New input methods, such as a track pad and a game controller, have been added to the System UI to provide input for the external display. These generate mouse and game-controller input events, enabling any game that supports the standard Android game controller API to be controlled from the tablet.

Later, the tablet is connected wirelessly to the TV using Miracast via a Netgear Push2TV display adapter. This enables a true cordless Multi-Display experience, where users can enjoy content on a secondary screen without the hassle of cables.



Vasile Popescu, Chief Software Engineer - Ixonos
Mikkel Christensen, Chief Software Engineer - Ixonos
Martin Siegumfeldt, Chief Software Engineer - Ixonos
Jakob Jepsen, Chief Software Engineer - Ixonos

Friday, January 3, 2014

Ixonos Goes "Imaging Tampere Get-Together"

Companies with a presence in Tampere, Finland have started a movement to make the region a center of imaging expertise, which means focusing efforts on pattern recognition, image enhancement, augmented reality and so on. With this in mind, a get-together event was held in late November, and Ixonos, with its bright and enthusiastic engineers, had to be there too! Other participants included people from the Tampere University of Technology, Intel, and several startups and established players in fields such as video surveillance.

Instead of just showing up with a stack of business cards, though, we decided to amuse the crowd by whipping up a special demonstration running on the Intel MinnowBoard. It turned out well and was much loved by the participants.

Ixonos Imaging Demo
The system consists of a PlayStation 3 camera attached to a MinnowBoard, along with a display for visualising the imaging algorithm results. The MinnowBoard is a small, low-cost embedded platform using an Intel® Atom™ CPU. In addition, a racing track playset with two electric cars was used as the pattern recognition problem. The software consists of the Ixonos Embedded Linux BSP (board support package), the OpenCV imaging library, and a very simple application that tracks the two cars on the racing track, calculating their lap times and lap counts.
Minnowboard (at the back), PS3 camera, racing track!
Car recognition is done by simple color segmentation. The colors are preset, and blobs of a given color are recognised with an OpenCV routine. The centroid of each blob is then visualised on the screen, and its passage over the "start line" is tracked. Very simple. Not a display of our pattern recognition algorithm abilities (call us if that is what you want), but rather of our ability to quickly integrate a complete system into which a specialised algorithm could later be dropped. And fun. The purpose was to have fun!

More detailed image processing steps:
  1. Capture image frames (640x480)
  2. Resize frames down to 320x240
  3. Blur to reduce noise
  4. Convert from BGR to HSV color space
  5. Apply filtering thresholds and create binary image
  6. Use moments to calculate the position of the center of the object
  7. Use coordinates to track the object and apply tracking visualizations on top of the image
  8. Display frames with tracking visualizations
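Steps 5-6 boil down to computing the raw image moments of the thresholded blob (this is what OpenCV's moments routine returns). A small sketch of the math, with a plain array standing in for the binary image:

```javascript
// Centroid of a binary image via raw moments, as computed by OpenCV's
// moments routine: m00 = pixel count, m10 = sum of x, m01 = sum of y.
function centroid(mask) {
    let m00 = 0, m10 = 0, m01 = 0;
    for (let y = 0; y < mask.length; y++) {
        for (let x = 0; x < mask[y].length; x++) {
            if (mask[y][x]) {
                m00 += 1;   // zeroth moment: blob area
                m10 += x;   // first moment in x
                m01 += y;   // first moment in y
            }
        }
    }
    if (m00 === 0) return null; // no blob of this color in view
    return { x: m10 / m00, y: m01 / m00 };
}

// A 2x2 blob whose center sits at (1.5, 1.5):
const mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
];
console.log(centroid(mask)); // → { x: 1.5, y: 1.5 }
```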




The proud author (Ilkka Aulomaa) of the playset car recognition system
About the authors
Ilkka Aulomaa, M.Sc. - author of the car recognition system software and setup.
Mikael Laine, M.Sc. - author of this blog post and participant "in spirit" in creating the demonstrator (which means sitting on a sofa and making smart-ass comments). He wrote his Master's thesis under the title "On Optical Character Recognition on Mobile Devices" (later published as "A Standalone OCR System for Mobile Cameraphones" in the proceedings of the 2006 IEEE 17th International Symposium on Personal, Indoor and Mobile Radio Communications). He has also participated in research in the field of pattern recognition.

Friday, November 29, 2013

Interactive Embedded Touch GUIs with HTML5

Recently we've been considering graphical user interfaces (GUIs) from the point of view of a systems integrator. There are several things to consider when creating a complete solution, such as support for several different software platforms (embedded devices, phones, tablets, desktop computers, ...), data network considerations, and future proofing.

Several technical solutions come to the rescue here. Firstly, there are standards that span all the involved platforms and allow software development to be done once, with perhaps some adaptation for each platform. Secondly, networks of all sizes and shapes allow for powerful distributed systems, where data can be shared and interaction can happen across the room or from the other side of the globe.

The Ixonos Embedded HTML5 library - ixgui.js - has proven to be a highly flexible and scalable platform for creating embedded GUIs. Recently, a number of system topologies have been explored using ixgui.js, involving running the GUI as detached from the embedded device. HTML5 obviously fits natively into this kind of distributed environment. The GUI can be hosted on the cloud, on an embedded device or basically anywhere.

Sensor data sharing in our demonstrator is facilitated by the Ixonos sensact library, which you can read about in an earlier blog post.

The user interface for this demo is simple. It displays data coming in from the TI Sensor Hub Booster Pack. In addition, there is an RPM display and a setting slider, but these are only for show: there is no motor in this version, though there will be in later ones. Below is a screenshot of the GUI:

Simple Touch Interface using ixgui.js
The below video illustrates using this GUI on the Texas Instruments AM3359 Evaluation Module with a separate, more elaborate, GUI running on a detached display.

ixgui.js is an HTML5-based GUI library that allows performance-optimized GUI creation, using the Canvas 2D interface for fast graphics and fine control over what is drawn at any given time. It is designed around the principles of simplicity, performance, standards compliance and programmer friendliness.

This article outlines some key methods for improving Canvas 2D performance. It has been extremely gratifying to fine-tune drawing for ixgui.js, and indeed we implement optimization on several levels.

At the top level, rendering is optimized by only drawing what needs to be redrawn. In most GUIs, items only need to be redrawn when they are interacted with.
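As a simplified sketch of that idea (illustrative only, not the actual ixgui.js code):

```javascript
// Redraw-on-demand sketch: each item carries a dirty flag, and the
// render loop only touches items whose state has actually changed.
class Item {
    constructor(name) {
        this.name = name;
        this.dirty = true;      // everything is drawn once on startup
        this.drawCount = 0;
    }
    touch() { this.dirty = true; }              // called on user interaction
    draw()  { this.drawCount++; this.dirty = false; }
}

function renderFrame(items) {
    for (const item of items) {
        if (item.dirty) item.draw();            // clean items are skipped entirely
    }
}

const items = [new Item("slider"), new Item("gauge")];
renderFrame(items);     // initial frame: both items drawn
renderFrame(items);     // nothing changed: nothing drawn
items[0].touch();       // user drags the slider
renderFrame(items);     // only the slider is redrawn
console.log(items[0].drawCount, items[1].drawCount); // → 2 1
```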

Pre-rendering: often a large part of an item is static and requires no update at all during the entire lifecycle of the application. In these cases, we can simply pre-render the areas that don't change onto a separate buffer and reuse that buffer on each redraw. As an example, the picture below shows how the vertical sliders in the demo are drawn:
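In sketch form the technique looks like this (a string stands in for the offscreen buffer; in a real Canvas implementation the static part would be drawn once to a hidden canvas and blitted back with drawImage):

```javascript
// Pre-rendering sketch: the static part of a widget is rendered once
// into an offscreen buffer, and every later frame reuses the cached
// buffer instead of re-running the expensive static drawing code.
let staticRenders = 0;

function renderStaticBackground() {
    staticRenders++;                // expensive path: should run only once
    return "slider-track-pixels";   // stands in for an offscreen canvas
}

let cachedBackground = null;

function drawSlider(value) {
    if (cachedBackground === null) {
        cachedBackground = renderStaticBackground();
    }
    // blit the cached background, then draw only the dynamic knob on top
    return `${cachedBackground} + knob@${value}`;
}

drawSlider(10);
drawSlider(20);
drawSlider(30);
console.log(staticRenders); // → 1: the track was rendered only once
```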

Finally, at the lowest level possible (in JavaScript), some optimization is achieved by feeding only integer values to the drawing routines. All coordinates and dimensions throughout the GUI are cast to integers.
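A micro-example of that last point (fractional coordinates make the Canvas 2D renderer anti-alias across pixel boundaries, which blurs edges and costs performance):

```javascript
// Snap fractional layout coordinates to integers before they reach the
// Canvas 2D drawing routines; fractional values force the browser to
// anti-alias shapes across pixel boundaries.
function snap(coord) {
    return Math.round(coord);
}

// e.g. a slider knob position computed at x = 41.66 from a percentage:
const x = snap(41.66);
console.log(x); // → 42
// ctx.fillRect(x, snap(103.2), snap(12.0), snap(47.8)); // integers only
```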

Mikael Laine, SW Specialist - Ixonos

Friday, November 8, 2013

Ixonos Multi-Display for Android

Ixonos enables its Multi-Display feature for recent generation Android - making multi-tasking easy.

Have you ever tried using the Android secondary display APIs (described here) that were introduced in Android Jelly Bean 4.2? Using the Presentation class to show content from your app is quite cool, but you are still limited to running only one activity at a time. Basically, you launch a dialog (Presentation) on the secondary display from your activity running on the main display. This is useful for certain types of apps, such as image and PowerPoint presenters, but what about running the stock Android Browser on one display and watching YouTube on the other?

Watch the video below and see what Ixonos has created to enable true multitasking for multiple displays.


This technology is a generic solution that enables the user to run existing Android applications on either display and to map external input devices to a given display. It utilizes the new Android display manager service and is thus display agnostic, meaning that any type of display can be used, e.g. HDMI or Miracast. The Multi-Display feature can be integrated into recent Android generations (4.2, 4.3, 4.4) by our engineers.


Vasile Popescu, Chief Software Engineer - Ixonos
Mikkel Christensen, Chief Software Engineer - Ixonos
Henrik Kai, Chief Software Engineer - Ixonos 

Friday, October 25, 2013

Build Gear version 0.9.19beta released!

A new version of Build Gear has recently been released.

A lightweight embedded firmware build tool

Build Gear is the open source build tool used to build the Ixonos Embedded Linux BSP for various embedded boards based on a range of different chipsets, including TI OMAP/AM/DM, Freescale i.MX, and Intel Atom/Haswell. This build tool allows us to very effectively create and maintain clean-cut, modern Linux BSP firmware tailored to fulfil the requirements of individual embedded customers.

This release includes a couple of new features and some bug fixes.

One of the interesting new features is a command that creates a software manifest, providing a detailed list of the software components involved in a particular build. This is quite useful when you need an overview of the licenses of the components going into your firmware. In fact, for most projects this is an important feature, since it allows the BSP firmware to be legally approved before going to production.

For more details, see the release announcement here.

The Build Gear tool has been in beta for quite some time, but it has now stabilized to the point where it is ready to leave beta: it will soon be labelled stable, and a 1.0 release will mark the final transition.

Expect more posts from me on this build tool and on how and why we use it to create the Ixonos Embedded Linux BSP platform solution.

Keep it simple!