Data has become the most important resource for business today, with companies tapping any available source to gain greater insights into their customers and their operations. The amount of global data created, captured, copied and consumed grew from two zettabytes in 2010 to 64 zettabytes in 2020 and is projected to hit 181 zettabytes by 2025.
Like every other industry, manufacturing has been transformed as equipment goes high-tech and sensors are embedded in products on the factory floor, on the high street, in homes, in outer space, and just about everywhere else.
“We’re applying new sensor systems … where all the delicious information is hidden,” said Philipp Niemietz (pictured, right), intermediate head of the department at the Laboratory for Machine Tools and Production Engineering (known as WZL) at RWTH Aachen University in Germany. “We’re getting close to the process, applying video data streams, more sensor data, and even [internet of things] scenarios. We’re talking about sensors that have like maybe a million data points a second.”
Niemietz and Russ Caldwell (pictured, left), senior product manager at Dell Technologies Inc., spoke with Lisa Martin, host of theCUBE, SiliconANGLE Media’s livestreaming studio, for a digital CUBE Conversation on how the enhanced video capabilities of Dell EMC’s streaming data platform are enabling anomaly detection and quality control in manufacturing through the use of sensors, cameras and X-ray cameras, and on how research labs such as WZL can use the platform to drive their innovation process. (* Disclosure below.)
Dell SDP condenses time to unify incoming and historical data resources
Being able to embed sensors and collect all this data is an immense leap forward on the Industry 4.0 timeline. But capturing it is only the first step. Incoming data must be analyzed in real time in order to capitalize on its value. Dell’s Streaming Data Platform was designed to make gathering and analyzing real-time streaming data simple, supporting modern industrial use cases such as drones with livestream video capabilities, resource tracking for supply-chain logistics, and predictive maintenance and safety monitoring.
Answering the needs of customers across manufacturing industries, SDP takes the ability to gain insights one step further: It combines real-time and historical data analysis.
“[With] this unified concept of time, we … do not have to separate domains that we have to deal with,” Niemietz said.
Previously, the lab spent a lot of time dealing with DevOps for its Hadoop stack, which used Apache Spark for traditional historical analysis and Apache Kafka and Storm for ingestion and stream processing. With Dell’s SDP, WZL can collapse these into one model, reducing the time needed to maintain and implement the code, according to Niemietz.
“That unified concept of time is really powerful because, with one line of code, you can jump to any point on the timeline of your data, whether it’s the real-time data coming off of the sensors right now or something minutes, hours, years ago,” Caldwell explained. “The more data types that you can bring in, the more problems you can solve. So bringing on as many on-ramps and connectivity into other solutions is really important.”
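The unified timeline Caldwell describes can be sketched conceptually. The minimal Python example below is an illustration only, not SDP’s actual API; every name in it is hypothetical. It models a single stream in which historical and freshly ingested events live on one sorted timeline, so a single call can read from any point, past or present, without switching between a batch system and a streaming system:

```python
import bisect
from dataclasses import dataclass, field


@dataclass
class TimelineStream:
    """Toy stream that unifies historical and live reads on one timeline."""
    events: list = field(default_factory=list)  # (timestamp, value), kept sorted

    def append(self, timestamp, value):
        """Ingest a new event, as a sensor would in real time."""
        bisect.insort(self.events, (timestamp, value))

    def read_from(self, timestamp):
        """Jump to any point on the timeline -- one call, past or present."""
        start = bisect.bisect_left(self.events, (timestamp,))
        return self.events[start:]


# Historical data and "live" data land in the same stream...
stream = TimelineStream()
stream.append(1, "archived reading")
stream.append(2, "older reading")
stream.append(3, "live reading")

# ...so one call reads across both without separate batch and stream paths.
print(stream.read_from(2))
```

The point of the sketch is the single `read_from` entry point: whether the requested timestamp falls in archived history or in data that just arrived, the caller uses the same interface, which is the “unified concept of time” Niemietz and Caldwell describe.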
The SDP team is focused on finding those common patterns everywhere so they can make it the norm to analyze streaming data, not just historical batch data, Caldwell added.
Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of enterprise technology, digital transformation, and cultures of innovation. (* Disclosure: Dell Technologies Inc. sponsored this CUBE Conversation. Neither Dell nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)