
Big Data Is Bigger Than Ever

Big data has never been more central to our lives than it is today. Advanced analytics technologies, for example, make it possible to derive value from underlying data and deliver results in complex fields such as research on COVID-19. So where is analytics headed next, and what kinds of solutions will enable it?

Big data experts generally agree that the amount of data generated will grow exponentially. A recent report by the independent analyst firm IDC predicts that the global datasphere will reach around 175 zettabytes by 2025. What is driving this growth? A steady rise in Internet users conducting their lives online, from business communications to shopping to social networking. IDC estimates that within five years, 75 percent of the world’s population will interact with online data daily. And it is not just people driving the growth: billions of connected devices and embedded systems are feeding the emerging field of IoT data analysis.

Data analysis has come a long way in a short time. Understanding of what can be achieved with data has evolved, as has the maturity of the tools that leverage it, so the value extracted from data keeps growing in innovative and exciting ways. Entirely new avenues of data science are opening up, from IoT analytics and advanced analysis of large datasets to DataOps.

Diverse Areas Of Application For Analytics

Online retailers can already use analytics to follow the customer journey from initial interest to purchase decision. Each step of the trip is quantifiable and measurable in one way or another. A single customer’s data becomes part of a larger dataset composed of the preferences of thousands of consumers. Analytics professionals leverage the latest software platforms to uncover insights for a more targeted and relevant customer experience.
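The quantifiable journey described above is often summarised as a conversion funnel: count how many unique customers reach each stage and where they drop off. The sketch below illustrates the idea with a hypothetical event log and stage names (`visit`, `product_view`, `add_to_cart`, `purchase` are assumptions, not from the original); a real platform would run this over millions of events.

```python
from collections import Counter

# Hypothetical event log: one entry per (customer_id, journey_stage).
events = [
    ("c1", "visit"), ("c1", "product_view"), ("c1", "add_to_cart"), ("c1", "purchase"),
    ("c2", "visit"), ("c2", "product_view"),
    ("c3", "visit"), ("c3", "product_view"), ("c3", "add_to_cart"),
    ("c4", "visit"),
]

def funnel(events, stages):
    """Count unique customers reaching each stage and the conversion
    rate relative to the previous stage."""
    reached = Counter()
    seen = set()
    for customer, stage in events:
        if (customer, stage) not in seen:  # count each customer once per stage
            seen.add((customer, stage))
            reached[stage] += 1
    report, prev = [], None
    for stage in stages:
        count = reached[stage]
        rate = count / prev if prev else 1.0
        report.append((stage, count, rate))
        prev = count
    return report

for stage, count, rate in funnel(events, ["visit", "product_view", "add_to_cart", "purchase"]):
    print(f"{stage:13s} {count} customers ({rate:.0%} of previous stage)")
```

Aggregated across thousands of customers, the per-stage conversion rates show exactly where prospects abandon the journey, which is where a retailer would target improvements.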

The value of modern analytics lies in unveiling important information that was present in the data but previously inaccessible or invisible. This can disrupt the dynamics of an otherwise fixed market. Gartner cites the example of banks and their focus on wealth management services. The traditional view has been that older customers are the most interested in these products. With advanced analysis, however, the banks found that younger customers, aged 20 to 35, are more likely to use such services. Thorough analysis removed bias and mistaken assumptions in one fell swoop.
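The bank example boils down to segmenting customers by age and comparing service uptake per segment. A minimal sketch, with entirely hypothetical records (the ages and flags below are invented for illustration, not Gartner's data):

```python
# Hypothetical customer records: (age, uses_wealth_management).
customers = [
    (24, True), (29, True), (33, True), (27, False), (31, True),
    (52, False), (58, True), (61, False), (67, False), (55, False),
]

def uptake_by_bracket(customers, brackets):
    """Share of customers in each (lo, hi) age bracket using the service."""
    rates = {}
    for lo, hi in brackets:
        in_bracket = [used for age, used in customers if lo <= age <= hi]
        rates[(lo, hi)] = sum(in_bracket) / len(in_bracket) if in_bracket else 0.0
    return rates

print(uptake_by_bracket(customers, [(20, 35), (50, 70)]))
```

With this toy data the 20-35 bracket shows far higher uptake than the 50-70 bracket, the kind of counterintuitive segment comparison the article describes surfacing from real customer data.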

An even more recent example of the power of analytics is the work of scientists and researchers worldwide searching for a cure for COVID-19. Scientific computing platforms play a key role in supporting this vital work, accelerating progress from data analysis to simulation and visualisation to AI and edge processing.

Supercomputers And GPUs As A Basis

For example, Oxford Nanopore Technologies was able to sequence the virus’s genome in just seven hours using fast graphics processors. Using GPU-accelerated software, the US National Institutes of Health and the University of Texas generated a 3D structure of the virus’s protein from cryogenic electron microscopy data. GPU-driven AI accurately classified COVID-19 infections from lung scans, speeding up treatment planning. And in drug development, Oak Ridge National Laboratory used an InfiniBand-connected, GPU-accelerated supercomputer to screen a billion potential drug combinations in just 12 hours.

In the development of ever faster and more powerful analytics, standards and limits are constantly being pushed. One of the most important benchmarks in data analysis is TPCx-BB. It includes queries that combine SQL and machine learning on structured data with natural language processing on unstructured data, reflecting the diversity of modern data analysis workflows.

The TPCx-BB performance record was recently beaten by almost 20x using the RAPIDS suite of open-source data science libraries running on 16 NVIDIA DGX A100 systems. The benchmark completed in just 14.5 minutes, compared with a previous best result of 4.7 hours on a CPU-powered cluster.
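The two figures above can be checked directly: converting the CPU result to minutes and dividing gives the claimed speedup.

```python
# Sanity-check the reported speedup: 4.7 hours (CPU cluster) vs 14.5 minutes (GPU).
cpu_minutes = 4.7 * 60        # 282 minutes
gpu_minutes = 14.5
speedup = cpu_minutes / gpu_minutes
print(f"{speedup:.1f}x")      # prints "19.4x", i.e. "almost 20x"
```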

Accelerated visualisation solutions that span terabytes of data have applications in other areas of science as well. For example, NASA has used the technology to visualise, interactively and in real time, the landing of the first crewed mission to Mars, in what is described as the world’s largest volume rendering.

With the digital transformation, data is now the beating heart of every company. But only with the right technology can these organisations determine which data matters most, unlock the most important insights from that data, and decide what actions to take to leverage that data.
