When measuring and testing computer applications, scientists and engineers collect huge amounts of data every second of the day. For instance, the world’s largest particle collider, the Large Hadron Collider, generates approximately 40 terabytes of data per second. The jet engine of a Boeing creates approximately ten terabytes of data every thirty minutes. When a jumbo jet crosses the Atlantic Ocean, its four engines can produce approximately 640 terabytes of data. Multiply that by an average of 2,500 daily flights, and the amount of data produced per day is staggering; this is what is called Big Data.
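A quick back-of-envelope calculation makes that scale concrete. This is just a sketch using the article's own estimates (640 TB per flight, 2,500 flights per day), not measured figures:

```python
# Rough check of the daily data volume implied by the figures above.
# All inputs are the article's estimates, not measurements.
tb_per_flight = 640      # TB produced by a jumbo jet's four engines per crossing
flights_per_day = 2_500  # average daily flights cited above

daily_tb = tb_per_flight * flights_per_day
daily_pb = daily_tb / 1_000      # terabytes -> petabytes (decimal units)
daily_eb = daily_pb / 1_000      # petabytes -> exabytes

print(f"{daily_tb:,} TB/day = {daily_pb:,.0f} PB/day = {daily_eb:.1f} EB/day")
# prints "1,600,000 TB/day = 1,600 PB/day = 1.6 EB/day"
```

On these assumptions, aircraft engines alone would account for roughly 1.6 exabytes per day, which is why conventional single-machine tools cannot keep up.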
Drawing conclusions and extracting actionable insight from such large volumes of data is a difficult task, and this challenge is at the heart of Big Data. Big Data has brought about new ways of processing data: we now have deep data analysis tools, data integration tools, search tools, reporting tools, and maintenance tools that help process big data to derive value from it.
The International Data Corporation (IDC) performed a study in which music, video files, and other data files were analyzed. The study indicated that the amount of data being produced by systems is …
Read More on Datafloq