Google’s new IoT Core service helps businesses manage their IoT data and devices

Google Cloud launched a new Internet of Things management service today called Google Cloud IoT Core that provides a way for companies to manage IoT devices and process data being generated by those devices. A transportation or logistics firm, for example, could use this service to collect data from its vehicles and combine it with other information like weather, traffic and demand to place… Read More


Microsoft launches new IoT services for the enterprise

 Microsoft is launching IoT Central today, a new Internet of Things (IoT) service that gives enterprises a fully managed solution for setting up their IoT deployments without needing the in-house expertise necessary for deploying a cloud-based IoT solution from scratch. It’s basically IoT-as-a-Service. Read More


A year after Jasper acquisition, Cisco expands the platform

Just over a year ago Cisco bought Jasper Technologies for $1.4 billion, and with that transaction, created the company’s IoT cloud business. This week, Cisco is making a series of announcements at Mobile World Congress related to expanding the platform. When a startup like Jasper gets acquired by a big company like Cisco, the standard line of reasoning goes something like this:… Read More


MobileIron adding IoT management to its arsenal

MobileIron, which went public in 2014, has been known mostly for helping large companies manage mobile devices, especially in a time when people tend to bring their own. Today it announced it was expanding that mission to the Internet of Things.
When you think about it, it’s a logical move for a company that is used to overseeing a large number of devices and helping IT keep them secure. Read More


Soracom launches SIM cards and services easing IoT rollout cost

One of Soracom's case studies is Farmnote, "a solution that involves attaching a sensor to heads of cattle and polling data on their activity." One of the challenges of the Internet of Things is the poor availability and high price of mobile data. Twilio announced it was going to have a stab at changing that earlier this year. Today Soracom showed off its solutions to ease the pain of IoT adoption across the industry. The company’s slightly byzantine pricing plans are optimized for a number of low-bandwidth, high-uptime applications. Read More


GE Digital snags ServiceMax for $915 million

GE Digital announced today it was buying ServiceMax, a cloud-based field service management company, for $915 million. GE Ventures had been an investor in the company as part of the $82 million Series F round in 2015. The two companies have more in common than you might think. GE is building the Predix Platform, on top of which it hopes companies will build applications that take advantage… Read More

-- launches an IFTTT for business users

With Flow, the company has long offered an integration tool that allowed technical users (think IT admins and developers) to create complex, multi-step integrations with the help of an easy-to-use drag-and-drop interface. Now, however, the company has also launched a more basic version of Flow that is aimed at business users who want to create IFTTT-like integrations between applications like… Read More


Softbank has completed its £24B cash acquisition of ARM Holdings

One of the biggest tech deals this year — and the biggest ever in the UK — has now closed. Today, Softbank announced that it has completed its acquisition of ARM Holdings, the semiconductor firm that it said in July it would acquire for £24 billion in cash (around $32 billion in today’s currency, $31 billion at the time of the deal), in order to make a big jump into IoT. As… Read More


Autodesk looks to future with investments in 3D robotic drones and IoT as a service

When you think of Autodesk, you probably think of desktop software and traditional manufacturing, but the company is trying hard to change that perception, and today it announced the first three investments from its $100 million Forge Fund, which include a 3D robotics drone company, an on-demand machine shop service and a platform for building smart connected Internet of Things devices… Read More


Using MySQL 5.7 Document Store with Internet of Things (IoT)

In this blog post, I’ll discuss how to use MySQL 5.7 Document Store to track data from Internet of Things (IoT) devices.

Using JSON in MySQL 5.7

In my previous blog post, I looked into the MySQL 5.7.12 Document Store. This is a brand new feature in MySQL 5.7, and many people are asking when they need or want to use the JSON datatype or the Document Store interface.

Storing data in JSON may be quite useful in some cases, for example:

  • You already have JSON data (e.g., from external feeds) and need to store it anyway. Using the JSON datatype will be more convenient and more efficient.
  • For the Internet of Things, specifically, when storing events from sensors: some sensors may send only temperature data, some may send temperature, humidity and light (but light information is only recorded during the day), etc. Storing it in JSON format may be more convenient in that you don’t have to declare all possible fields in advance, and do not have to run “alter table” if a new sensor starts sending new types of data.
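As a quick illustration of the second point, here is a small Node.js sketch (the sensor payloads are invented for this example) showing that events with different field sets can share one JSON column, and that a new field needs no schema change:

```javascript
// Hypothetical sensor events (invented for illustration): each device
// reports a different set of fields.
var events = [
    { coreid: 'sensor-1', temperature: 25.0 },
    { coreid: 'sensor-2', temperature: 24.1, humidity: 49.4 },
    { coreid: 'sensor-3', temperature: 23.8, humidity: 51.0, light: 780 }
];

// All three serialize into the same JSON column as-is.
var rows = events.map(function (e) { return JSON.stringify(e); });

// A brand-new field (say, soil_temperature) needs no "alter table":
rows.push(JSON.stringify({ coreid: 'sensor-4', soil_temperature: 11.2 }));

console.log(rows.length + ' documents, no schema changes');
```

With a fixed-column table, the fourth event would have required an ALTER TABLE before the insert could succeed.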

Internet of Things

In this blog post, I will show an example of storing an event stream from a Particle Photon. Last time, I created a device to measure light and temperature and stored the results in MySQL. Particle provides the ability to use its own MQTT server and publish events with:

Spark.publish("temperature", String(temperature));
Spark.publish("humidity", String(humidity));
Spark.publish("light", String(light));

Then, I wanted to “subscribe” to my events and insert those into MySQL (for further analysis). As we have three different metrics for the same device, we have two basic options:

  1. Use a field per metric and create something like this: device_id int, temperature double, humidity double, light double
  2. Use a record per metric and have something like this: device_id int, event_name varchar(255), event_data text (please see this Internet of Things, Messaging and MySQL blog post for more details)

The first option above is not flexible. If my device starts measuring the soil temperature, I will have to “alter table add column”.

Option two is better in this regard, but it may significantly increase the table size, as I have to store the name as a string for each measurement. In addition, some devices may send more complex metrics (e.g., latitude and longitude).

Using JSON for storing metrics can be a better option here, and it also gave me a chance to try Document Store.
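To make the comparison concrete, here is a sketch (metric values invented) of the same reading stored as option-2 rows versus a single JSON document. Note how option 2 repeats the metric name as a string in every row, while the JSON form carries each key once and also handles nested values:

```javascript
// One reading with three metrics as option 2: a row per metric,
// with the metric name stored as a string every time.
var option2Rows = [
    { device_id: 1, event_name: 'temperature', event_data: '25.0' },
    { device_id: 1, event_name: 'humidity',    event_data: '49.4' },
    { device_id: 1, event_name: 'light',       event_data: '780'  }
];

// The same reading as a single JSON document.
var jsonDoc = { device_id: 1, temperature: 25.0, humidity: 49.4, light: 780 };

// More complex metrics nest naturally in the JSON form.
var gpsDoc = { device_id: 2, location: { latitude: 40.7, longitude: -74.0 } };

console.log(option2Rows.length + ' rows vs 1 document');
```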

First, we will need to enable the X Plugin and set up the Node.js connector. Here are the steps required:

  1. Enable X Plugin in MySQL 5.7.12+, which uses a different port (33060 by default)
  2. Download and install NodeJS (>4.2) and mysql-connector-nodejs-1.0.2.tar.gz (follow the Getting Started with Connector/Node.JS guide).
    # node --version
    # wget
    # npm install mysql-connector-nodejs-1.0.2.tar.gz

    Please note: on older systems you will probably need to upgrade the nodejs version (follow the Installing Node.js via package manager guide).

Storing Events from Sensors

Particle provides you with an API that allows you to subscribe to all public events (“events” are what sensors send). The API is for NodeJS, which is really convenient, as we can use NodeJS for the MySQL 5.7.12 Document Store as well.

To use the Particle API, install the particle-api-js module:

$ npm install particle-api-js

I’ve created the following NodeJS code to subscribe to all public events, and then add the data (in JSON format) to a document store:

var mysqlx = require('mysqlx');
var Particle = require('particle-api-js');
var particle = new Particle();
var token = '<place your token here>';
var mySession = mysqlx.getSession({
    host: 'localhost',
    port: 33060,
    dbUser: 'root',
    dbPassword: '<place your pass here>'
});
process.on('SIGINT', function() {
    console.log("Caught interrupt signal. Exiting...");
    process.exit();
});
particle.getEventStream({ auth: token }).then(function(stream) {
    stream.on('event', function(data) {
        // insert each event, as-is, into the event_stream collection
        mySession.then(session => {
            session.getSchema("iot").getCollection("event_stream")
                .add( data )
                .execute(function (row) {
                    // can log something here
                })
                .then( function (notices) {
                    console.log("Wrote to MySQL: " + JSON.stringify(notices));
                })
                .catch(err => {
                    console.log(err);
                });
        }).catch(function (err) {
            console.log(err);
        });
    });
}).catch(function (err) {
    console.log(err);
});

How it works:

  • particle.getEventStream({ auth: token }) gives me the stream of events. From there I can subscribe to specific event names, or to all public events using the generic name “events”: stream.on('event', function(data)).
  • function(data) is a callback function fired when a new event is ready. The event payload is the JSON object data. From there I can simply insert it into the document store: .add( data ).execute() inserts the JSON data into the event_stream collection.

One of the reasons I use a document store here is that I do not have to know what is inside the event data. I do not have to parse it; I simply send it to MySQL and analyze it later. If the format of the data changes in the future, my application will not break.

Inside the data stream

Here is the example of running the above code:

{ data: 'Humid: 49.40 Temp: 25.00 *C Dew: 13.66 *C HeatI: 25.88 *C',
  ttl: '60',
  published_at: '2016-05-20T19:30:51.433Z',
  coreid: '2b0034000947343337373738',
  name: 'log' }
Wrote to MySQL: {"_state":{"rows_affected":1,"doc_ids":["a3058c16-15db-0dab-f349-99c91a00"]}}
{ data: 'null',
  ttl: '60',
  published_at: '2016-05-20T19:30:51.418Z',
  coreid: '50ff72...',
  name: 'registerdev' }
Wrote to MySQL: {"_state":{"rows_affected":1,"doc_ids":["eff0de02-726e-34bd-c443-6ecbccdd"]}}
{ data: '24.900000',
  ttl: '60',
  published_at: '2016-05-20T19:30:51.480Z',
  coreid: '2d0024...',
  name: 'Humid 2' }
{ data: '[{"currentTemp":19.25},{"currentTemp":19.19},{"currentTemp":100.00}]',
  ttl: '60',
  published_at: '2016-05-20T19:30:52.896Z',
  coreid: '2d002c...',
  name: 'getTempData' }
Wrote to MySQL: {"_state":{"rows_affected":1,"doc_ids":["5f1de278-05e0-6193-6e30-0ebd78f7"]}}
{ data: '{"pump":0,"salt":0}',
  ttl: '60',
  published_at: '2016-05-20T19:30:51.491Z',
  coreid: '55ff6...',
  name: 'status' }
Wrote to MySQL: {"_state":{"rows_affected":1,"doc_ids":["d6fcf85f-4cba-fd59-a5ec-2bd78d4e"]}}

(Please note: although the stream is public, I’ve tried to anonymize the results a little.)

As we can see, the “data” field is JSON, and every event shares that envelope structure. I could have implemented it as a MySQL table structure (adding published_at, name, ttl and coreid as separate fields). However, I would then depend on those specific fields and have to change my application if they changed. We also see examples of how a device can send data back: it can be just a number, a string or another JSON document.
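Since “data” can arrive as a number, a plain string, or nested JSON, a small helper like the following (my own sketch, not part of the original subscriber code) can normalize it at read time:

```javascript
// Normalize a Particle "data" payload: it may be a number,
// a free-form string, or a JSON-encoded object/array.
function parseEventData(raw) {
    if (raw === null || raw === 'null') return null;
    var asNumber = Number(raw);
    if (raw !== '' && !isNaN(asNumber)) return asNumber;  // e.g. '24.900000'
    try {
        return JSON.parse(raw);                           // e.g. '{"pump":0,"salt":0}'
    } catch (e) {
        return raw;                                       // e.g. 'Humid: 49.40 Temp: 25.00 *C'
    }
}

console.log(parseEventData('24.900000'));                 // 24.9
console.log(parseEventData('{"pump":0,"salt":0}'));       // { pump: 0, salt: 0 }
console.log(parseEventData('Humid: 49.40 Temp: 25.00'));  // kept as a string
```

Because the raw document is still stored untouched in MySQL, this parsing can evolve without losing any data.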

Analyzing the results

Now I can go to MySQL and use SQL (which I’ve used for >15 years) to find out what I’ve collected. First, I want to know how many device names I have:

mysql -A iot
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 3289
Server version: 5.7.12 MySQL Community Server (GPL)
Copyright (c) 2000, 2016, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql> select count(distinct json_unquote(doc->'$.name')) from event_stream;
+---------------------------------------------+
| count(distinct json_unquote(doc->'$.name')) |
+---------------------------------------------+
|                                        1887 |
+---------------------------------------------+
1 row in set (5.47 sec)

That is slow! As described in my previous post, I can create a virtual column and index for doc->'$.name' to make it faster:

mysql> alter table event_stream add column name varchar(255)
    -> generated always as (json_unquote(doc->'$.name')) virtual;
Query OK, 0 rows affected (0.17 sec)
Records: 0  Duplicates: 0  Warnings: 0
mysql> alter table event_stream add key (name);
Query OK, 0 rows affected (3.47 sec)
Records: 0  Duplicates: 0  Warnings: 0
mysql> show create table event_stream\G
*************************** 1. row ***************************
       Table: event_stream
Create Table: CREATE TABLE `event_stream` (
  `doc` json DEFAULT NULL,
  `_id` varchar(32) GENERATED ALWAYS AS (json_unquote(json_extract(`doc`,'$._id'))) STORED NOT NULL,
  `name` varchar(255) GENERATED ALWAYS AS (json_unquote(json_extract(`doc`,'$.name'))) VIRTUAL,
  UNIQUE KEY `_id` (`_id`),
  KEY `name` (`name`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8
1 row in set (0.00 sec)
mysql> select count(distinct name) from event_stream;
+----------------------+
| count(distinct name) |
+----------------------+
|                 1820 |
+----------------------+
1 row in set (0.67 sec)

How many beers left?

Eric Joyce has published a Keg Inventory Counter that uses a Particle Photon device to measure the amount of beer left in a keg by 12oz pours. I wanted to see the average and the lowest amount of beer left per day:

mysql> select date(json_unquote(doc->'$.published_at')) as day,
    ->        avg(json_unquote(doc->'$.data')) as avg_beer_left,
    ->    min(json_unquote(doc->'$.data')) as min_beer_left
    -> from event_stream
    -> where name = 'Beers_left'
    -> group by date(json_unquote(doc->'$.published_at'));
+------------+--------------------+---------------+
| day        | avg_beer_left      | min_beer_left |
+------------+--------------------+---------------+
| 2016-05-13 |  53.21008358996988 | 53.2          |
| 2016-05-18 |  52.89973045822105 | 52.8          |
| 2016-05-19 | 52.669233854792694 | 52.6          |
| 2016-05-20 |  52.60644257702987 | 52.6          |
+------------+--------------------+---------------+
4 rows in set (0.44 sec)
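Just for illustration, the same per-day aggregation can be written in plain Node.js over a few sample documents (the values here are made up; the SQL above is what actually runs against the collection):

```javascript
// Per-day avg/min over event documents, mirroring the GROUP BY query.
var docs = [
    { published_at: '2016-05-13T10:00:00Z', name: 'Beers_left', data: '53.4' },
    { published_at: '2016-05-13T18:00:00Z', name: 'Beers_left', data: '53.0' },
    { published_at: '2016-05-14T09:00:00Z', name: 'Beers_left', data: '52.8' },
    { published_at: '2016-05-14T20:00:00Z', name: 'log',        data: 'noise' }
];

var perDay = {};
docs.filter(function (d) { return d.name === 'Beers_left'; })  // where name = 'Beers_left'
    .forEach(function (d) {
        var day = d.published_at.slice(0, 10);                 // date(published_at)
        var val = parseFloat(d.data);
        var s = perDay[day] || (perDay[day] = { sum: 0, count: 0, min: Infinity });
        s.sum += val;
        s.count += 1;
        if (val < s.min) s.min = val;
    });

Object.keys(perDay).forEach(function (day) {
    var s = perDay[day];
    console.log(day, (s.sum / s.count).toFixed(2), s.min);
});
```

The filter corresponds to the WHERE clause on the indexed virtual column, and the per-day accumulator plays the role of GROUP BY with avg() and min().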


Document Store can be very beneficial if an application works with a JSON field and does not know (or does not care about) its structure. In this post, I’ve used the “save to MySQL and analyze later” approach. We can then add virtual fields and indexes if needed.
