Smart buildings around the country are now much more efficient and secure, thanks to sensors that monitor temperature, electricity use, video surveillance and more. Manufacturers can ensure better quality control with thousands of sensors measuring machine operating data and part quality. Retailers can provide better customer service through smart shelves that detect low inventory and smart shopping carts that direct customers to the right aisles based on their digital shopping lists. Hospitals and doctors’ offices can improve patient care by remotely monitoring everything from heart rate to blood sugar levels.
All of these advances are possible because of the explosion of Internet-connected sensors, devices and systems. Commonly, and collectively, called the Internet of Things (IoT), these devices each have their own IP address, allowing them to continually feed data to a company’s IT infrastructure, where it can be collected, analyzed and managed.
“IoT is creating tons of new data as every sensor becomes another data source, but that’s only the beginning,” says Mike Matchett, senior analyst at Taneja Group. “As organizations realize that they can move from reporting on a sensor once a month to once every 30 seconds, and as they realize that there is more data they want to collect from sources they never realized could be valuable, the amount of data becomes very big very quickly.”
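Matchett's point about frequency is easy to quantify. The sketch below works through the arithmetic for a single sensor; the 30-day month and the 200-byte payload size are assumptions chosen purely for illustration, not figures from the article.

```python
# Back-of-the-envelope illustration: one sensor moving from a single
# reading per month to one reading every 30 seconds.
# Assumptions (not from the article): 30-day month, 200-byte readings.

SECONDS_PER_MONTH = 30 * 24 * 60 * 60   # 2,592,000 seconds in a 30-day month
READING_INTERVAL_S = 30                 # one reading every 30 seconds
PAYLOAD_BYTES = 200                     # assumed size of a single reading

readings_per_month = SECONDS_PER_MONTH // READING_INTERVAL_S
growth_factor = readings_per_month      # versus one reading per month
monthly_bytes = readings_per_month * PAYLOAD_BYTES

print(f"Readings per month: {readings_per_month:,}")            # 86,400
print(f"Growth factor: {growth_factor:,}x")                     # 86,400x
print(f"Data per sensor per month: {monthly_bytes / 1e6:.2f} MB")
```

Even under these modest assumptions, one sensor's output grows 86,400-fold, and that is before an organization adds the new data sources Matchett describes.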
That’s the issue companies are grappling with today. According to IDC, the installed base of the Internet of Things will grow to about 212 billion connected devices globally by the end of 2020.
This rapid growth is quickly becoming too much for many traditional IT infrastructures to handle. To manage all of this IoT data, servers, storage and network capacity must be fully flexible and scalable. If not, the constant stream of data could easily inundate a corporate network or consume so much bandwidth that there isn’t enough left for other processes. Likewise, storage has to be fast, plentiful and cost-effective to keep up with the volume of data that must be retained.
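To see why capacity planning matters, it helps to size a hypothetical fleet. Every number below (fleet size, payload, reporting interval) is an assumption for illustration only, but the shape of the calculation is what a capacity planner would run.

```python
# Hypothetical sizing sketch for a sensor fleet.
# All inputs are assumed for illustration, not taken from the article.

SENSORS = 100_000          # assumed fleet size
PAYLOAD_BYTES = 200        # assumed size of one reading
INTERVAL_S = 30            # one reading per sensor every 30 seconds
SECONDS_PER_YEAR = 365 * 24 * 60 * 60

bytes_per_second = SENSORS * PAYLOAD_BYTES / INTERVAL_S
mbit_per_second = bytes_per_second * 8 / 1e6      # sustained network ingest
tb_per_year = bytes_per_second * SECONDS_PER_YEAR / 1e12  # raw storage need

print(f"Sustained ingest: {mbit_per_second:.1f} Mbit/s")
print(f"Raw storage per year: {tb_per_year:.1f} TB")
```

Roughly 5 Mbit/s of steady ingest and about 21 TB of raw data per year from this one modest fleet; replication, indexing and analytics copies multiply the storage figure further, which is why the article stresses scalable storage and bandwidth headroom.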
Recent research from Gartner backs this up, noting that IoT will force companies to rethink the way they manage capacity across all layers of the IT stack. “Data center managers will need to deploy more forward-looking capacity management in these areas to be able to proactively meet the business priorities associated with IoT,” said Gartner VP Joe Skorupa in a statement.
With its flexibility and scalability, converged infrastructure can be a good answer to the influx of IoT data. To more effectively process, manage, store and analyze that data, some vendors have gone a step further, optimizing their converged infrastructures specifically for IoT and the big data it creates.
The most important optimizations are taking place in the storage architecture, Matchett says. Some converged infrastructures have moved to a scale-out design that offers enterprise data protection while allowing big data to be accessed through a variety of protocols. Many have also begun incorporating or integrating with Hadoop and Apache Spark, which are well suited to the log-structured data that makes up most IoT data, he adds.
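"Log-structured" here simply means append-only records, one per reading, which is why map-reduce-style tools fit so well. The minimal sketch below shows the kind of per-sensor aggregation a Spark or Hadoop job performs at scale, using only the Python standard library; the record fields (sensor_id, temp_c) and sample values are assumptions for illustration.

```python
import json
from collections import defaultdict

# Minimal sketch of the aggregation a Spark/Hadoop job runs at scale on
# log-structured IoT data: each line is an independent, append-only record.
# Field names and sample values are assumed for illustration.

log_lines = [
    '{"sensor_id": "s1", "temp_c": 21.5}',
    '{"sensor_id": "s1", "temp_c": 22.1}',
    '{"sensor_id": "s2", "temp_c": 19.8}',
]

totals = defaultdict(lambda: [0.0, 0])   # sensor_id -> [running sum, count]
for line in log_lines:
    record = json.loads(line)
    bucket = totals[record["sensor_id"]]
    bucket[0] += record["temp_c"]
    bucket[1] += 1

averages = {sid: round(s / n, 2) for sid, (s, n) in totals.items()}
print(averages)   # {'s1': 21.8, 's2': 19.8}
```

Because every record is self-contained, the same logic parallelizes trivially: a cluster can split the log into chunks, aggregate each chunk independently, and merge the partial sums, which is exactly the pattern Hadoop and Spark exploit.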
Underwritten by HPE
Part of HPE’s Power of One strategy, HPE Converged Architecture 700 delivers infrastructure as one integrated stack: proven, repeatable building blocks of infrastructure maintained by one management platform (HPE OneView), built and delivered exclusively by qualified HPE Channel Partners. This methodology saves considerable time and resources compared with the do-it-yourself (DIY) approach.
Based on a complete HPE stack consisting of HPE BladeSystem with Intel® Xeon® E5 v3-based HPE ProLiant BL460c Gen9 blades, HPE 3PAR StoreServ all-flash storage, HPE Networking, and HPE OneView infrastructure management software, the HPE Converged Architecture 700 can be easily modified to fit within your existing IT environment.