Carl Ford of Crossfire Media and the M2M Evolution magazine advisory board recently spoke with Jerry Cuomo and Gari Singh of IBM about Big Blue’s views on machine-to-machine solutions, including MQ Telemetry Transport (MQTT), a lightweight publish/subscribe protocol that runs over TCP/IP, designed for remote sensors and control devices communicating over low-bandwidth, unreliable, or intermittent links.
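MQTT routes messages by topic, with subscribers expressing interest through topic filters. As a rough illustration of that publish/subscribe matching, here is a minimal sketch of MQTT's topic-filter rules (the `+` single-level and `#` multi-level wildcards come from the MQTT specification; the topic names below are invented):

```python
def topic_matches(filter_str, topic):
    """Return True if an MQTT-style topic filter matches a topic.

    Per the MQTT spec, '+' matches exactly one topic level and
    '#' matches all remaining levels.
    """
    f_parts = filter_str.split("/")
    t_parts = topic.split("/")
    for i, fp in enumerate(f_parts):
        if fp == "#":
            return True          # '#' matches everything below this level
        if i >= len(t_parts):
            return False         # filter is deeper than the topic
        if fp != "+" and fp != t_parts[i]:
            return False         # literal level must match exactly
    return len(f_parts) == len(t_parts)

# Subscribing to one sensor field across every vehicle:
print(topic_matches("fleet/+/temperature", "fleet/car42/temperature"))  # True
print(topic_matches("fleet/#", "fleet/car42/engine/rpm"))               # True
print(topic_matches("fleet/+/temperature", "fleet/car42/engine"))       # False
```

The wildcard scheme is what lets one subscription cover thousands of devices without enumerating them, which is part of why MQTT suits M2M fan-out.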
Cuomo commented up front that something like MQTT could really make M2M work efficiently, but wouldn’t do the job alone; it would need intelligence behind it. “It’s the combination of that with some of our analytic capabilities that really make this very interesting and compelling for IBM,” he said.
Here’s the rest of the interview.
CF: In the Java world, you were Sun’s No. 1 supporter in building Java. Has this come anywhere near embedded Java?
CF: With M2M, are there network I/O concerns from your perspective? The $6-billion WebSphere world suggests there are. Tell me a little bit about that. Are you building massive analytics?
GS: I always frame things in simple questions. So you have big data, and I ask: Where does big data come from? We talk about processing the data, but where does it actually come from? There is an opportunity to drive big data with big messaging. We have all these devices, and while you have 3G and 4G, the ubiquity of TCP is the game changer; I think that's what allows us to do everything. You can get all kinds of data rates from all kinds of devices, whether it’s locations or cars sending you data. All of that can drive massive amounts of data. Within IBM we actually work on two interesting parts on the back side. One is: How do you consume all that data and buffer it to get it to the right applications? That has been the focus of what I've been working on in terms of high-scale messaging infrastructure for pub/sub eventing, because eventually all this stuff is eventing. And on the other side we have a couple of different technologies for doing processing.
Things like Hadoop, our BigInsights and big analytics products for big data. We also have some other interesting technology, a kind of stream-processing technology where we can actually do analytics on big streams of data while they are in motion.
So you can start to imagine that I'm getting big volumes of sensor information – cars reporting their information, people reporting their information – and I want to trigger an instantaneous response based on some correlation of events or analytics.
Those are the interesting things we have found and are working on, and when we talk to customers about this, they say it makes a lot of sense. We've actually found a number of customers who have that need even though they did not quite state it that way. So big messaging plus big data equals big analytics.
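As a rough sketch of the "big messaging" idea Singh describes (buffering device events and routing them to the right applications), a toy in-memory broker might look like the following; all names here are illustrative, not an IBM API:

```python
from collections import defaultdict, deque

class MiniBroker:
    """Toy publish/subscribe broker: absorbs bursts of device events
    into a buffer, then fans them out to subscriber callbacks."""

    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks
        self.buffer = deque()                  # pending (topic, payload) events

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        self.buffer.append((topic, payload))   # absorb the burst first

    def drain(self):
        """Deliver buffered events to the applications that asked for them."""
        while self.buffer:
            topic, payload = self.buffer.popleft()
            for cb in self.subscribers[topic]:
                cb(payload)

broker = MiniBroker()
received = []
broker.subscribe("cars/location", received.append)
broker.publish("cars/location", {"id": "car42", "lat": 40.7, "lon": -74.0})
broker.drain()
print(received)  # the buffered event, delivered to the subscriber
```

Decoupling producers from consumers through a buffer is what lets the messaging tier smooth out spikes from millions of intermittently connected devices.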
CF: Does this mean we are heading toward predictive systems? Is that a possibility when you listen to the streams?
GS: Exactly. Predictive analytics. That’s what we can do, and it’s pretty interesting.
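To make the idea of analytics on data in motion concrete, here is a minimal sketch of detecting a condition over a rolling window of sensor readings as they arrive; this illustrates the general technique only, not IBM's stream-processing product, and the threshold and readings are invented:

```python
from collections import deque

def rolling_alert(stream, window=5, threshold=100.0):
    """Flag the timestamps where the rolling-window mean of a sensor
    stream exceeds a threshold, processing events as they arrive
    rather than after they land in storage."""
    win = deque(maxlen=window)   # only the last `window` values are kept
    alerts = []
    for t, value in stream:
        win.append(value)
        if len(win) == window and sum(win) / window > threshold:
            alerts.append(t)
    return alerts

# A steadily rising sensor value crosses the threshold mid-stream:
readings = [(t, 90 + t * 3) for t in range(10)]
print(rolling_alert(readings))  # [6, 7, 8, 9]
```

Because the window is bounded, memory use stays constant no matter how long the stream runs, which is the property that makes in-motion analytics feasible at M2M scale.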
JC: The more data you have, the closer you are to making sense of, and gaining insight into, what’s happening. Some of the things Gari mentions are pretty key. One of the things big messaging does is add a level of intelligence to the data coming in, by tagging it and correlating it with the other things passing through. So it’s almost doing a pre-process of analyzing the data, almost in the network, as it is being delivered, as it arrives into the raw data pool, with some level of interest. Simple, very simple, but powerful things we are working on are telematics and location technology. Location services are popular and at the root of a lot of these use cases. You are doing a geo-fence and you're asking: Is this thing inside the geo-fence or outside it? You can tell the messaging system: don’t just deliver this data to me; tell me whether this data is inside or outside this boundary.
This could help quiet the noise, because even though we want every message from every device we can get our hands on, we also want to be able to preprocess the data, preferably in the network. Then when we get to the predictive stuff it’s much more targeted; it's not about looking through a sea of information. The information has been narrowed down toward the subject matter at hand. That makes it much more efficient, and efficient in terms of time, space, and storage as well.
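The geo-fence filter Cuomo describes can be sketched with a simple haversine distance check against a circular fence; the coordinates and radius below are invented for illustration, and this is not IBM's implementation:

```python
import math

def inside_geofence(lat, lon, center_lat, center_lon, radius_km):
    """Is a reported position inside a circular geo-fence?

    Uses the haversine formula for great-circle distance, so a
    messaging layer could tag each event inside/outside instead
    of forwarding every raw location update.
    """
    r = 6371.0  # mean Earth radius in km
    dlat = math.radians(lat - center_lat)
    dlon = math.radians(lon - center_lon)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(center_lat)) * math.cos(math.radians(lat))
         * math.sin(dlon / 2) ** 2)
    dist = 2 * r * math.asin(math.sqrt(a))
    return dist <= radius_km

# Only deliver events relative to a 5 km fence around one point:
print(inside_geofence(40.758, -73.985, 40.7580, -73.9855, 5.0))  # True
```

Evaluating this predicate in the messaging tier is what turns "every message from every device" into the narrowed stream described above.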
Edited by Braden Becker