
Big Data Is Bigger Than Ever

Big data has never been more central to our lives than it is today. Advanced analytics technologies, for example, make it possible to extract value from underlying data and deliver results in complex fields such as research on COVID-19. So where is analytics headed next, and what kinds of solutions will enable it?

Big data experts generally agree that the amount of data generated will grow exponentially. A recent report by the independent analyst firm IDC predicts that the global datasphere will reach around 175 zettabytes by 2025. What is driving this growth? A steady rise in Internet users conducting their lives online, from business communications to shopping to social networking. IDC estimates that within five years, 75 percent of the world's population will interact with online data daily. And it is not just people driving the growth: billions of connected devices and embedded systems now feed the emerging discipline of IoT data analysis.

Data analysis has come a long way in a short time. Our understanding of what can be achieved with data has evolved, as has the maturity of the tools that leverage it. As a result, the value of data grows in innovative and exciting new ways. Entirely new avenues of data science are opening up, from IoT analytics and advanced analysis of large datasets to DataOps.

Diverse Areas Of Application For Analytics

Online retailers can already use analytics to follow the customer journey from initial interest to purchase decision. Each step of the journey is quantifiable and measurable. A single customer's data becomes part of a larger dataset composed of the preferences of thousands of consumers. Analytics professionals leverage the latest software platforms to uncover insights for a more targeted and relevant customer experience.
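A journey like this is typically measured as a funnel: count how many distinct customers reach each step and compute conversion rates between them. The following minimal sketch with pandas uses invented event names and toy data purely for illustration; it is not tied to any particular retail platform.

```python
# Minimal funnel-analysis sketch: event names and data are invented.
import pandas as pd

events = pd.DataFrame({
    "customer_id": [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "step": ["view", "cart", "purchase",
             "view", "cart",
             "view", "cart", "purchase", "purchase"],
})

funnel_order = ["view", "cart", "purchase"]

# Count distinct customers reaching each step of the journey.
reached = (events.drop_duplicates(["customer_id", "step"])
                 .groupby("step")["customer_id"].nunique()
                 .reindex(funnel_order))

# Share of initial visitors who make it to each step.
conversion = reached / reached.iloc[0]
print(conversion)
```

In practice the same aggregation runs over millions of event rows; the point is that each step becomes a measurable number rather than an impression.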

The value of modern analytics lies in unveiling information that is present in the data but was previously inaccessible or invisible. This can disrupt the dynamics of an otherwise fixed market. Gartner cites the example of banks and their focus on wealth management services. The traditional view has been that older customers are the most likely to be interested in these products. With advanced analysis, however, banks found that younger customers, aged 20 to 35, are more likely to use such services. Thorough analysis removed bias and flawed assumptions in one fell swoop.
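The kind of analysis behind that finding can be as simple as comparing product uptake rates across customer segments. Here is a hedged sketch with pandas; the ages, uptake flags, and column names are all invented for illustration and do not come from Gartner's study.

```python
# Hypothetical segment comparison, loosely mirroring the
# wealth-management example. All data here is invented.
import pandas as pd

customers = pd.DataFrame({
    "age": [24, 31, 28, 55, 62, 47, 33, 70, 26, 58],
    "uses_wealth_mgmt": [1, 1, 0, 0, 1, 0, 1, 0, 1, 0],
})

# Bucket customers into the two segments under discussion.
customers["segment"] = pd.cut(customers["age"],
                              bins=[0, 35, 120],
                              labels=["20-35", "over 35"])

# Uptake rate per segment: the data, not intuition, decides.
uptake = customers.groupby("segment", observed=True)["uses_wealth_mgmt"].mean()
print(uptake)
```

With this toy data the younger segment shows the higher uptake rate, which is exactly the kind of result that overturns a "fixed" assumption about a market.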

An even more recent example of the power of analytics is the work of scientists and researchers worldwide searching for a cure for COVID-19. This vital work is supported not least by scientific computing platforms, which accelerate progress across data analysis, simulation, visualisation, AI, and edge processing.

Supercomputers And GPUs As A Basis

For example, Oxford Nanopore Technologies was able to sequence the virus's genome in just seven hours using fast graphics processors. Using GPU-accelerated software, the US National Institutes of Health and the University of Texas generated a 3D structure of the virus's spike protein from cryogenic electron microscopy data. GPU-driven AI accurately classified COVID-19 infections from lung scans, speeding up treatment planning. And in drug development, Oak Ridge National Laboratory used an InfiniBand-connected, GPU-accelerated supercomputer to screen a billion potential drug combinations in just 12 hours.

In the development of ever faster and more powerful analytics, records and limits are constantly being broken. One of the most important benchmarks in data analysis is TPCx-BB. It includes queries that combine SQL and machine learning on structured data with natural language processing on unstructured data, reflecting the diversity of modern data-analysis workflows.

The record for TPCx-BB performance was recently surpassed by almost 20x by the RAPIDS suite of open-source data science libraries running on 16 NVIDIA DGX A100 systems. The benchmark completed in just 14.5 minutes, compared with a previous best of 4.7 hours on a CPU-powered compute cluster.
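The defining trait of such workloads is the mix of stages: a structured SQL step feeds an analytical step downstream. The toy sketch below illustrates that shape with Python's standard library only; the table, columns, data, and the deviation rule are all invented, and real TPCx-BB queries run at vastly larger scale on frameworks such as RAPIDS.

```python
# Illustrative sketch of a mixed SQL + analytics workload.
# Schema and data are invented; this is not a TPCx-BB query.
import sqlite3
import statistics

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (store TEXT, category TEXT, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("a", "books", 10.0), ("a", "toys", 25.0),
    ("b", "books", 30.0), ("b", "toys", 5.0),
    ("c", "books", 20.0), ("c", "toys", 25.0),
])

# Structured stage: SQL aggregation of revenue per store.
rows = con.execute(
    "SELECT store, SUM(amount) FROM sales GROUP BY store ORDER BY store"
).fetchall()
totals = [amount for _, amount in rows]

# Analytical stage: flag stores whose revenue deviates
# from the mean by more than a chosen threshold.
mean = statistics.mean(totals)
flagged = [store for store, amt in rows if abs(amt - mean) > 5.0]
print(flagged)
```

Swapping the in-memory SQLite table for a distributed, GPU-resident dataframe changes the scale, not the shape, of the pipeline.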

Accelerated visualisation solutions spanning terabytes of data have applications in other areas of science as well. NASA, for example, has used the technology to visualise the landing of the first crewed mission to Mars interactively and in real time, in the world's largest volume rendering.

With digital transformation, data has become the beating heart of every company. But only with the right technology can organisations determine which data matters most, unlock the most important insights from it, and decide what actions to take.
