How much data is too much? Edge Computing states a case for smaller data analysis
Gone are the days of businesses making decisions based on hunches, or doing things because ‘that’s the way they’ve always been done’. Instead, data is fuelling the next wave of business innovation, allowing companies to make smarter decisions, often in real time. Key to this success is big data, best described as the mining of data sets too large to be processed by traditional software. Big data allows companies to make sense of vast amounts of information, including both structured data and raw streaming data, to model outcomes more accurately, or even to uncover previously unknown correlations. For example, the anti-depressant drug Desipramine was found to have potential benefits in treating some forms of lung cancer – a connection that would likely never have been made without big data analysis.

Too much poor data
With an increase in the number of IoT devices, there has been a huge opportunity to use the extra data collected to bring even greater insight into businesses. But perhaps there’s now too much data, and not all of it is useful. Cisco has estimated that, driven by IoT devices, we will generate a staggering 847 zettabytes (ZB) of data per year by 2021. Simply dealing with this overwhelming level of data using cloud systems will require ever-more expensive infrastructure. The greater challenge is in separating useful data (that which you can act on) from the noise. Indeed, Cisco estimates that, by 2021, useful data will only be around 85 zettabytes – just 10% of the total generated. Edge computing and edge AI may hold the answer, however, and there are two main ways that they can help.

Improved data collection
Cloud-based big data analysis is incredibly powerful and the more useful information you can give a system, the better the answers you can get from the questions you ask. For example, in a retail environment, demographic data collected by a facial recognition system can add detail to simple sales information, letting you know not just what you sell, but who’s buying it.
Filtering data on the edge
With an Edge Computing solution, a computer connected to a camera can automatically extract just the demographic information, sending only that to the cloud for storage and processing. This dramatically cuts down on the amount of data collected, transmitting purely useful information. Likewise, with IoT sensors, is it really necessary to send a measurement every second for storage? By storing data locally and, perhaps, sending averages over longer time periods, Edge Computing can cut down on noise, filtering data so that only useful and relevant information is transmitted. Perhaps most importantly, in an age where people are worried about security and privacy, Edge Computing offers a responsible and secure way to collect data. Returning to our demographic example, no private video or facial data is sent to a server; instead, an Edge Computer extracts the useful, non-personalised data and transmits only this to the cloud.

Real-time data processing
Big data analysis has two main modes of implementation: data modelling and real-time processing. Modelling helps give business insights and the big picture; real-time data lets you react to what’s happening right now.
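To make the idea concrete, here is a minimal sketch of an edge device doing both jobs at once: reacting to an anomalous reading in real time while forwarding only a per-interval average to the cloud. The function name, window size and alert threshold are illustrative assumptions, not anything specified above.

```python
from statistics import mean

# Hypothetical parameters -- illustrative only.
WINDOW = 60              # per-second readings collapsed into one average
ALERT_THRESHOLD = 75.0   # readings above this trigger an immediate local reaction

def process_stream(readings):
    """Edge-side loop: raise alerts in real time, but only
    forward one average per window upstream.

    Returns (averages_sent, alerts_raised).
    """
    averages, alerts, buffer = [], [], []
    for value in readings:
        if value > ALERT_THRESHOLD:
            alerts.append(value)       # real-time reaction, no cloud round-trip
        buffer.append(value)
        if len(buffer) == WINDOW:      # filtering: 60 readings become 1 average
            averages.append(round(mean(buffer), 2))
            buffer.clear()
    return averages, alerts

# Two minutes of per-second temperature readings containing one spike:
readings = [20.0] * 90 + [80.0] + [20.0] * 29
avgs, alerts = process_stream(readings)
print(len(readings), "readings ->", len(avgs), "averages,", len(alerts), "alert(s)")
# 120 readings -> 2 averages, 1 alert(s)
```

Here the cloud receives a sixtieth of the raw traffic, yet the spike is still acted on the moment it occurs – the two benefits described in the sections above.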