Now that nearly everyone has come around to the idea that the Internet of Things has merit, a number of application development challenges are conspiring to slow IoT progress. These range from a general shortage of developers with IoT expertise to immature business models.
For the most part, IoT business models remain an act of faith. There is little proof that continuously updating software running on embedded systems is commercially viable. Most IoT development work today assumes not only that organizations have devices that can be remotely updated, but also that customers are willing to pay extra for those kinds of services.
In a world where anything involving machine-to-machine applications tends to operate on razor-thin margins, organizations moving into the IoT space will have to make sure the business models behind any particular project are sound enough to support that project through a lifecycle that could last a decade or more.
Unfortunately, at the moment there is a lot more unknown than known about what the market will truly bear, especially when it comes to distinguishing between what customers would perceive as a unique, value-added service worth an extra fee and something that should be baked into the customer service experience.
"Business models are by no means set in stone," James Brehm, principal analyst for James Brehm & Associates, told me. "There's a lot of trial and error going on".
The next major challenge from a development perspective is a general shortage of developers with the skills to build these applications. Demand for IoT developers is beginning to outstrip supply, and the cost of hiring them climbs with each passing day. Yet the ability to build applications that run on IoT endpoints will be critical to the success of any IoT endeavor.
At the recent Machine-2-Machine Evolution conference, part of an ITEXPO event, Peter Utzschneider, Oracle vice president of Java product management, said that simply shoveling all the data collected by every endpoint in an IoT environment back to the datacenter won't be viable.
What will be required instead, he said, are event-driven architectures in which a fair amount of processing takes place on the endpoint before data gets shipped back to the datacenter. Only then can the network bandwidth consumed be held to a level that keeps networking costs reasonable. That capability is also critical to the performance of IoT applications, many of which will be sensitive to network latency.
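To make that pattern concrete, here is a minimal sketch in Java, the language Oracle is pitching for exactly this role. The Sensor and Uplink interfaces are hypothetical stand-ins for whatever hardware and transport a real deployment would use; the point is that the endpoint samples continuously but transmits an event only when a reading changes enough to matter:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class EdgeFilter {

    interface Sensor { double read(); }            // hypothetical device API
    interface Uplink { void send(String event); }  // hypothetical transport

    // Report only when a reading moves more than this from the last one sent.
    private static final double THRESHOLD = 2.5;

    public static void start(Sensor sensor, Uplink uplink) {
        ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();
        final double[] lastReported = { Double.NaN };

        // Sample once per second on the endpoint itself; ship an event
        // back to the datacenter only when the value changes meaningfully.
        scheduler.scheduleAtFixedRate(() -> {
            double value = sensor.read();
            if (Double.isNaN(lastReported[0])
                    || Math.abs(value - lastReported[0]) > THRESHOLD) {
                uplink.send(String.format("{\"reading\": %.2f}", value));
                lastReported[0] = value;
            }
        }, 0, 1, TimeUnit.SECONDS);
    }
}
```

Filtering at the source this way trades a little endpoint compute for a large reduction in uplink traffic, which is precisely the bandwidth-and-latency bargain Utzschneider describes.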
Utzschneider also said that IoT application developers will assume a level of interoperability that currently doesn't exist. Oracle is proposing that Java play a significant role in providing that interoperability. But in the meantime, many organizations are in danger of being locked into proprietary architectures. "Interoperability is going to be a massive challenge," he said. "There's something of a land grab going on where vendors are trying to lock as many things up as best they can."
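Until such a standard materializes, one pragmatic hedge against that land grab is architectural: confine any proprietary SDK behind a thin, vendor-neutral interface so that switching vendors later means rewriting one adapter rather than the whole application. A minimal sketch follows; DeviceGateway, AcmeIotClient, and AcmeGateway are all hypothetical names invented for illustration, not any real vendor's API:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

// Vendor-neutral seam: application code depends only on this interface.
interface DeviceGateway {
    void publish(String deviceId, byte[] payload);
    void subscribe(String deviceId, Consumer<byte[]> handler);
}

// Stub standing in for a proprietary vendor SDK (hypothetical).
class AcmeIotClient {
    private final Map<String, Consumer<byte[]>> handlers = new HashMap<>();
    void sendToDevice(String id, byte[] data) { /* vendor transport goes here */ }
    void onMessage(String id, Consumer<byte[]> callback) { handlers.put(id, callback); }
}

// Adapter: the only class that knows about the vendor SDK. Switching
// vendors means rewriting this one class, not the application.
class AcmeGateway implements DeviceGateway {
    private final AcmeIotClient client = new AcmeIotClient();

    @Override
    public void publish(String deviceId, byte[] payload) {
        client.sendToDevice(deviceId, payload);
    }

    @Override
    public void subscribe(String deviceId, Consumer<byte[]> handler) {
        client.onMessage(deviceId, handler);
    }
}
```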
The cost of providing the infrastructure to support the big data analytics applications needed for an IoT deployment can be substantial. Technologies such as Hadoop have made it less expensive to store data, but the number of servers required to support a Hadoop cluster is still quite large. Naturally, many organizations are looking to public cloud services to get around that issue. But the more data the cloud stores, the more expensive those services become over time.
None of this means that a robust IoT application environment won't eventually emerge. But it does mean that, from a developer's perspective, a number of issues need to be addressed before the IoT becomes the developer opportunity that everyone envisions.