Published: 2022-08-02

Enabling 100-times faster fluorescence imaging of 3D cancer spheroids

NEWS In a new study, researchers at Umeå University and the company Sartorius in Umeå combine deep learning with automated fluorescence microscopy to image 3D cancer tumours. The method speeds up the imaging process by a factor of 100, something that was previously not possible. The results are published in the journal PLOS ONE.

Text: Ingrid Söderbergh


“Our new method makes it possible to study 3D cancer tumors to a greater extent with more generated images in a shorter time. This, in turn, can lead to more confident conclusions in research, better understanding and faster development of new vital drug treatments,” says Edvin Forsgren, PhD student at the Department of Chemistry at Umeå University.

The usual way to grow cells is on a flat plate, in so-called 2D cell cultures. In the current study, cancer tumours (clusters of cancer cells called cancer spheroids) are instead grown in a gel and therefore grow in 3D, which is more like how they grow in real life. However, the images taken are still in 2D.

Fluorescence microscopy of 3D cancer tumours is challenging, as a special Z-stack method is required. In a Z-stack, many images at different focal depths are layered on top of each other to create a "stack" of images along the Z axis. Together, the images at different focal depths produce a sharp image of the entire 3D sample.
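To illustrate the idea, here is a minimal NumPy sketch of collapsing a Z-stack into a single sharp 2D image. The stack dimensions and the random data are purely illustrative, and maximum-intensity projection is one common choice of projection, not necessarily the exact one used in the study:

```python
import numpy as np

# Hypothetical Z-stack: 64 focal planes, each a 128x128 fluorescence image.
# Shapes and values are illustrative only.
rng = np.random.default_rng(0)
z_stack = rng.random((64, 128, 128))

# A Z-projection collapses the stack along the Z axis. A maximum-intensity
# projection keeps, for every pixel, the brightest value across all planes,
# so in-focus structures from any depth survive into the 2D image.
z_projection = z_stack.max(axis=0)

print(z_projection.shape)  # (128, 128)
```

The cost of this sharpness is that every one of the 64 planes must be exposed and captured separately.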

This traditional method has two major disadvantages: imaging large samples takes a long time, and the longer cells are exposed to fluorescent light, the greater the risk of phototoxicity and photobleaching. These problems are exacerbated when samples are analysed over time, which requires repeated Z-stacks and is therefore limited by the number of images that can be taken of live cell cultures.

An alternative to Z-stacks is an image taken by opening the shutter and moving the camera along the Z axis through the entire cell cluster, providing a blurred image of fluorescence over the entire Z dimension. Taking this image, called a Z-sweep, is a hundred times faster than the traditional Z-stack method, but the result is blurry. In the study, Edvin Forsgren and colleagues at the company Sartorius combine the two imaging techniques with deep learning methods (AI) to solve the problem.
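A rough way to see why a Z-sweep is fast but blurry: the sensor integrates light from every focal plane during a single exposure, rather than capturing each plane separately. The sketch below models this (an assumption for illustration, not the study's optics) as the mean over all planes of a simulated stack:

```python
import numpy as np

rng = np.random.default_rng(1)
n_planes = 100  # illustrative number of focal planes in the sample

# Simulated stack of focal planes (values are placeholder data).
z_stack = rng.random((n_planes, 64, 64))

# A Z-sweep keeps the shutter open while sweeping through the sample, so
# light from every plane accumulates into one exposure. As a crude model,
# average all planes: out-of-focus light from every depth mixes in,
# which is why the result is blurred.
z_sweep = z_stack.mean(axis=0)

# One exposure instead of n_planes exposures: roughly an n_planes-fold
# reduction in acquisition time.
print(f"speed-up factor ≈ {n_planes}")
```

The trade-off is exactly the one the article describes: one fast exposure versus many slow ones, at the cost of sharpness.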

“We have optimized an artificial neural network to convert Z-sweeps to Z-projections (which come from Z-stacks). When the AI network has been trained on a few hundred images, we can then give it new, unseen Z-sweeps and the AI network then generates new images that are roughly equivalent to the Z-projections,” says Edvin Forsgren.
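The training setup pairs each fast Z-sweep (input) with the slow Z-projection of the same sample (target). The study uses a deep neural network for this image-to-image restoration; as a minimal stand-in that only illustrates the supervised pairing, the sketch below fits a linear least-squares map from synthetic sweeps to projections. All shapes, the noise model, and the linear model itself are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical paired training data: n_pairs samples, each coupling a
# blurred sweep (input) with its sharp projection (target).
n_pairs, h, w = 200, 16, 16
projections = rng.random((n_pairs, h, w))
# Model the sweep as a scaled projection plus noise (an assumption).
sweeps = 0.5 * projections + 0.05 * rng.standard_normal((n_pairs, h, w))

# The study trains a deep network on such pairs; here, as a toy stand-in,
# fit a linear map sweep -> projection by least squares.
x = sweeps.reshape(n_pairs, -1)
y = projections.reshape(n_pairs, -1)
x1 = np.hstack([x, np.ones((n_pairs, 1))])  # append a bias column
coef, *_ = np.linalg.lstsq(x1, y, rcond=None)

# Apply the learned map: "restore" projections from sweeps.
restored = (x1 @ coef).reshape(n_pairs, h, w)
err = float(np.abs(restored - projections).mean())
print(f"mean restoration error on training pairs: {err:.4f}")
```

Once such a model is trained, new, unseen sweeps can be pushed through the same learned map — which is the step the real network performs at a hundredth of the acquisition cost of a Z-stack.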

In the study, the researchers show that the "trained" deep learning network can produce high-quality fluorescence microscopy images based on blurred Z-sweeps with high confidence.

They have tested the method on different types of cell cultures – both single and multispheroid images. In both cases, they found that the fluorescent signal from the new method maintained similar intensity to the slow, traditional images based on Z-stacks.

This means that the researchers can draw the same biological conclusions on cell culture samples while reducing the time required by a factor of 100.

“We carried out further experiments in which cancer tumours were treated with cytotoxic compounds, using our new method to quantify the effect of the treatment over time. Once again, we could reach the same conclusion as with the traditional Z-stack method, but in a fraction of the time.”

Information about the scientific article:

Forsgren E, Edlund C, Oliver M, Barnes K, Sjögren R, Jackson TR (2022): High-throughput widefield fluorescence imaging of 3D samples using deep learning for 2D projection image restoration. PLoS ONE 17(5): e0264241.


For more information, please contact:

Edvin Forsgren
Doctoral student