Acceleration through AI

The speed economy in data science: don't lag behind

In 1965, Gordon Moore identified the relation between time and the increasing complexity and capability of integrated circuits: the number of transistors on a microchip doubles roughly every two years. Ever since, a steady increase in compute speed per Central Processing Unit (CPU) has been observed.

By pushing the boundaries of technology, Moore's law still holds today. Now, assuming Moore's law puts an upper bound on the processing speed of a single CPU, how can shorter processing times be obtained? On the software level, this can be achieved through vectorisation, parallelisation, and the use of low-level programming languages like C/C++ or optimised libraries; on the hardware level, by parallelising the computations across many processing units.
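To make the software-level route concrete, the snippet below contrasts a plain Python loop with its vectorised NumPy equivalent, which hands the whole operation to optimised compiled code in one call. A minimal sketch, assuming NumPy is installed; exact timings will differ per machine.

```python
import time

import numpy as np

n = 10_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Plain Python loop: elements are multiplied one at a time.
start = time.perf_counter()
slow = [a[i] * b[i] for i in range(n)]
loop_time = time.perf_counter() - start

# Vectorised: the same multiplication runs in optimised, compiled code.
start = time.perf_counter()
fast = a * b
vectorised_time = time.perf_counter() - start

print(f"loop: {loop_time:.2f} s, vectorised: {vectorised_time:.4f} s")
```

On a typical machine the vectorised version is one to two orders of magnitude faster, purely from avoiding the Python interpreter overhead per element.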

Of course, you could use multiple multi-core CPUs at once, but the fundamental architecture of a CPU limits its performance in terms of data processing speeds. For this reason, Graphical Processing Units (GPUs) have gained a lot of popularity over the last couple of years. Thanks to a GPU's architecture, with many smaller, more efficient cores and a high memory bandwidth, computations can be massively parallelised, resulting in tremendously fast throughput and processing speeds.
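A minimal sketch of this hardware-level parallelism, assuming an NVIDIA GPU with CUDA and the CuPy library (a GPU-backed drop-in replacement for NumPy): the matrix product below is spread across thousands of GPU cores at once.

```python
import cupy as cp  # requires an NVIDIA GPU with CUDA and the 'cupy' package

# Allocate two large matrices directly in GPU memory.
a = cp.random.rand(4096, 4096, dtype=cp.float32)
b = cp.random.rand(4096, 4096, dtype=cp.float32)

# The matrix multiplication is executed in parallel on the GPU's cores.
c = a @ b

# GPU calls are asynchronous; wait for the device to finish before reading back.
cp.cuda.Stream.null.synchronize()
print(float(c.sum()))
```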


GPUs as the enabling technology of AI

Artificial Intelligence (AI) has become publicly well known in a remarkably short time. According to Google Trends, the number of Google searches for the term "AI" skyrocketed around December 2021. Interestingly, the concept underlying AI, neural networks, was already described in the 1940s by Warren McCulloch, and it was revisited by Geoffrey Hinton and Yann LeCun during the 1980s and 1990s. However, significant developments have only been made in the last 15 years.


So, what was holding back these technological advancements?

First, data availability: the functionality, precision, and effectiveness of deep neural networks improve with increasing amounts of high-quality data. Second, algorithmic improvements have been made in backpropagation, activation functions (such as ReLU), and weight initialisation. Third, industry adoption, with companies such as Google (BERT), Meta/Facebook (LLaMA 2), Microsoft (Copilot), and OpenAI (ChatGPT) investing heavily in the technology and releasing their algorithms as open source, helped the AI field flourish.
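To make the second point concrete, here is a minimal sketch of the ReLU activation and He weight initialisation in plain NumPy, a textbook-style illustration of the two ingredients named above rather than any particular library's implementation.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def relu(x):
    # ReLU keeps positive values and zeroes out the rest; unlike sigmoid
    # or tanh, its gradient does not vanish for large positive inputs.
    return np.maximum(0.0, x)

def he_init(fan_in, fan_out):
    # He initialisation scales the weights to the number of inputs, so the
    # variance of the activations stays stable across deep ReLU layers.
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

# One hidden layer applied to a batch of 32 random 784-dimensional inputs.
W = he_init(784, 128)
h = relu(rng.standard_normal((32, 784)) @ W)
print(h.shape)  # (32, 128)
```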

But these technology pushes still rely heavily on computational power. This hurdle was overcome by the development of high-performance GPUs. One of the GPUs considered ground-breaking for the field of AI is the NVIDIA Tesla K40, introduced in November 2013. Since then, many improvements in artificial intelligence computing hardware have been made, enabling the deployment of more complex AI models such as Large Language Models (LLMs), Generative Adversarial Networks (GANs), and object detection models, among others.

Publicly available platforms harnessing the power of AI have been adopted by industry, e-commerce, insurance companies, agriculture, and many more, and AI also accelerates scientific research and improves healthcare.


AI in healthcare: an example

The role of AI in healthcare became apparent during the COVID-19 pandemic, in which AI played a key role in various aspects of fighting the crisis. A couple of examples in which AI was deployed:

  • Epidemiological Surveillance: AI was used to monitor social platforms and Google searches for early signs of virus spread.
  • Social Distance Monitoring: Computer vision technology helped in ensuring social distancing norms were followed.
  • Disease Outbreak Forecasting: AI played a role in predicting the spread and impact of the disease.
  • Screening and Detection: AI aided in the rapid screening and detection of the disease.
  • Speeding Up Genome Sequencing: This was crucial for developing a vaccine. DNA dictates a protein's primary structure, which in turn determines its tertiary structure, the overall three-dimensional folding of the protein. Knowing this structure is beneficial, as it shows where drugs can bind.
  • Protein Structure Prediction: Google DeepMind's AlphaFold was used to predict the protein structures of the virus.
  • Accelerating Scientific Research: AI, particularly Large Language Models (LLMs), was employed to quickly review and analyse scientific literature, significantly speeding up research, development, and innovation. This helped knowledge sharing within the scientific community, which traditionally relies on writing and reading scientific articles; a small sketch of this idea follows after this list.
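As an illustration of that last point, the sketch below summarises a made-up abstract with an off-the-shelf model from the Hugging Face transformers library. The model name is just one freely available example; any comparable summarisation model would do.

```python
from transformers import pipeline  # assumes the 'transformers' package is installed

# Load an off-the-shelf summarisation model (downloaded on first use).
summariser = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

abstract = (
    "Coronavirus disease 2019 (COVID-19) spread rapidly across the world. "
    "We review containment measures, vaccine development timelines and the "
    "role of genome sequencing in tracking emerging viral variants."
)

summary = summariser(abstract, max_length=40, min_length=10)
print(summary[0]["summary_text"])
```

Applied over thousands of papers, the same pattern turns weeks of reading into a first-pass overview that a researcher can then verify.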


AI in the cloud

Everyone who has trained an AI model, whatever the use case, knows that it takes significant computing resources and time to do so, and that these requirements scale with model complexity and the amount of available data.
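Most frameworks make it easy to target whatever hardware is at hand, which is exactly what makes renting GPU power attractive: the same script runs unchanged on a laptop CPU or on a cloud GPU. A minimal sketch in PyTorch, assuming the torch package is installed:

```python
import torch
import torch.nn.functional as F

# Use a GPU when one is available; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A toy classifier and a single training step, just to show the pattern.
model = torch.nn.Linear(784, 10).to(device)
optimiser = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(32, 784, device=device)          # a batch of 32 fake samples
y = torch.randint(0, 10, (32,), device=device)   # fake class labels

loss = F.cross_entropy(model(x), y)
loss.backward()
optimiser.step()
print(f"ran one training step on {device}, loss = {loss.item():.3f}")
```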

The cost of buying, and the knowledge required to maintain, such a high-end computing environment on premises are significant. Many cloud providers, such as Amazon, Bytesnet, Google, and Microsoft, offer these computing resources as a commercial product, giving everyone access to the required compute in a flexible manner.

However, you must consider the differences between these cloud providers: costs, privacy, location, flexibility, CO2 footprint, support, and of course your own knowledge of MLOps software. If you have little interest in, or knowledge of, maintaining a local or cloud computing environment, other software platforms are available. A solution such as the AI infrastructure platform of UbiOps allows you to quickly run an AI and ML workload without any infrastructure knowledge, essentially turning every computer into a supercomputer.

Many Machine Learning (ML) and AI tools have been developed over the last decade and already play a major role in our daily lives as an integral part of society.

All these new developments can be overwhelming, but they are probably only the tip of the iceberg. Exciting times lie ahead, and the question is probably not whether you will be using AI, but how. So, come on board and don't miss out.

