Callum Welsh, Art Director, and Leo Chilcott, Production Director, shine a spotlight on the world of Medical Visualization and Visual Effects


 

Cebas: Hello Callum and Leo. Right from your landing page at Random42.com, the visitor is taken on a virtual ride diving into a molecular environment - does this say something about the scientific world of today and where innovation is heading?

 

Image: myelin sheath

 

Callum Welsh: There are some incredible body “mechanisms” that occur inside every one of us; however, it is surprising how little most people know about them. It can be tricky to visualise these processes through microscopic or even scanning electron microscopy (SEM) imagery, even though they can be the key to curing life-changing and/or terminal diseases.

As a result, the pharmaceutical industry is always looking for new ways to visualise these mechanisms, to help explain to consumers how their drugs can treat diseases on a molecular level.

Advances in VR and AR give us more ways to visualise these processes, and we spend a lot of time researching and developing new ways for consumers to interact with and be educated about these body mechanisms and drug treatments.

Cebas: Callum, you're the Senior Art Director at Random42, could you please tell us what the Studio name ‘Random42’ symbolizes?

Callum: As I’m sure some people are aware, “42” is the meaning of life in Hitchhiker’s Guide to the Galaxy. The “Random” part, I am told, comes from our founder, Hugo Paice, reading about chaos theory at the time and so he came up with the name by combining these two ideas together.

Cebas: 'Hugo Paice' - unique name. Leo, hello, and you're the Production Director at Random42. Please tell us why there has been such exponential growth in recent years in 3D medical visuals. What are some of the driving forces?

Leo Chilcott: Medical visuals are all about helping us understand what the naked eye cannot see. Scanning electron microscopes (SEM) are needed to see anything at that microscopic level of detail. Even then, due to the nature and size of these microscopes, it is extremely difficult to film inside a live human body zoomed in to that degree. Thanks to improvements in computer visual technology, 3D is becoming faster and easier to produce, and a more realistic level of detail is now possible. We are starting to be able to show these complex inner body processes and make it very clear exactly what is happening. This is highly effective in educating people on how our bodies really work, and quite frankly it's fascinating.

Image: Random42 artists working in the studio

Cebas: Leo, now that is going quite deep for anyone reading this with no prior production experience in medical art. Could you perhaps tell us a little about what your work as a Production Director encompasses?

Leo: I oversee our Production department, which encompasses all of our artists and designers. As Production Director, I am responsible for making sure our Art Directors have all the resources required to meet their project deadlines, and for ensuring our pipeline is as optimised as possible, keeping track of incoming projects and future planning, including any recruitment requirements that may arise. I am also responsible for checking in on all our artists to make sure they have everything they need and to help improve and develop their skills. In addition to this, I direct projects and teams of artists, along with the Art Directors, as we find it is key to be involved in production if you are to understand how to manage it. I act as the bridge between our Production department and the Marketing and Science departments, including our CEO and Medical Director, working to ensure everyone is on the same page and that any problems are highlighted and addressed.

Cebas: Callum, what is your role at Random42?

Callum: I am a Senior Art Director at Random42. Primarily, I lead a team of artists, taking scripts produced by our science team and fleshing them out into storyboards (more akin to look development in other studios) and then once approved, into our final animations. I will also assist with any technical problems around the office and help our newer artists learn any software/plugins we use, including Thinking Particles (TP).

Cebas: Did you start with medical visualization training right from college? What attracted you to medical work and, unlike many, away from big movie visual effects?

Callum: Ha, a bit of a happy accident perhaps? I had just graduated from university (3D Digital Animation at University of Hertfordshire) and was contacted by one of my lecturers who said that Hugo, Random42 founder, was looking for some artists. It turns out they had hired some graduates a year before from the same course.

For me, at the time, being offered a full-time position was a huge relief, especially in an industry of freelance and short-term contracts, and it brought a lot of stability to my life, for which I am very thankful. I never had medical animation on my radar at university, which I think applies to a lot of the people working here as well; they have come from other backgrounds.

Over the years I have really enjoyed the work here. There are always unique and interesting challenges, so the work never stagnates, and you get to pick up a lot of the science along the way.

Cebas: Callum, why did you decide that Thinking Particles was the best tool for your projects, and did our software help you achieve the desired effects without many hiccups?

Callum: I love using TP for keeping the animation procedural. We have a short turnaround for animations and it allows us to make pretty broad changes to scenes, such as when the client wants more cells in a division scene. I remember when I first started, I had to update a scene where the cell division had been animated, duplicated and placed by hand. Iterating on the animation timings or the number of cells in the scene was quite a hassle, compared to merely changing some values inside TP. We can also recycle a lot of the TP systems, so, for example, in the cell division scene we can switch out cell types and make them divide slower without having to redo all the animation.

To be able to approach these animations procedurally, we need a very flexible toolkit, as we are animating very different and unique processes scene to scene. The group workflow is fantastic for being able to add in extra behaviour easily. Something I like to do is tackle the hardest part of the system first and add the simpler elements afterwards. As long as I send the particles to the right groups of the TP setup, at any turn, I can slot in new dynamics with no hassle.

Referencing is something I use in almost every system. The GetRef or SetRef operators can be used to make particles move to exact positions, recall data from a “parent” particle and even create springy movement behaviours. I would definitely consider it one of the fundamentals to learn. The ParticleData and Memory nodes enable us to access and create our own channels, allowing us to create timers, map a particle's age to its size, store and transfer Material IDs/Shape IDs, and, in conjunction with the DataChannel node, export these channels to Frost (Thinkbox).
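
As a rough illustration of that channel idea, here is a purely conceptual sketch in plain Python; Thinking Particles is node-based, so none of these names are real TP identifiers, and the values are invented:

```python
# Conceptual sketch only: plain Python standing in for per-particle channel logic.
# Thinking Particles is node-based, so none of these names are real TP identifiers.
from dataclasses import dataclass, field

@dataclass
class Particle:
    birth_frame: int
    channels: dict = field(default_factory=dict)  # stands in for ParticleData/Memory channels

def update_channels(p: Particle, frame: int, grow_frames: int = 30, max_size: float = 1.0) -> Particle:
    """Store a per-particle timer and map the particle's age onto a size channel."""
    age = frame - p.birth_frame                # a simple "timer" channel
    t = max(0.0, min(age / grow_frames, 1.0))  # normalise the age to 0..1
    p.channels["timer"] = age
    p.channels["size"] = max_size * t          # age mapped to size, as described above
    return p

# A particle born on frame 10, evaluated on frame 25, is halfway through its growth.
print(update_channels(Particle(birth_frame=10), frame=25).channels)
```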

I think having so many options and not being limited by the operations available allows you to make your own custom systems for the job. I did find the learning curve for TP a little steep; there was only one other colleague learning it at the time, and on top of learning the new dynamics/group workflow, I had to learn a lot of the math and logic behind particles too. PFlow is great for simplifying these things for you, but in the long term you have so much more power when you get into TP and tinker with the data channels. For example, with TP geometry versus Frost geometry, TP is able to export MltIndex, ShapeIndex, GeomTime, Size and Orientation (see the comparison below).

Image: TP geometry vs Frost geometry comparison

Cebas: Callum, medical animations and effects are very different from movie FX; I imagine there are certain pockets of similarity, but they are mostly different. Can you tell us what your approach is in using Thinking Particles to animate medical visuals?

Callum: A quick disclaimer; I haven't worked on movie FX, but I read around forums/articles and have friends who have. Our pipeline is quite loose. It allows flexibility on smaller projects, as we can cover a broad range of content, from TV ads to IMAX, animals to the nanoworld. The main thing I notice is that there are a lot of tools purposefully built for certain effects, such as building demolition. There are also some fantastic forum posts/tutorials covering workflows and approaches for this. However, the medical content we make is usually not covered in the same way; I can’t pull up a tutorial for animating ribosome translation, for example. There are very few people at the moment creating medical animations. This means we have to be very independent when crafting these effects. It can be very stressful at times, but immensely rewarding.

The first stage would be to talk to our scientists about what needs to happen. There are some incredible mechanisms in the body, and trying to replicate them as authentically as possible can yield some fantastic animation. Usually certain events need to be triggered before others can proceed. If we look at a typical blood vessel rupture repair process, we have von Willebrand factors (vWF) that need to bind to collagen and unravel across the injury site. This allows platelet receptors to bind to the vWF, creating a first layer of repair. These platelets then activate and recruit additional cells to create more layers, before fibrin is deposited and stabilises the repair. It might be a little heavy on the medical lingo, but it shows the series of events. Usually a process completing is what allows the next event to occur.
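
As a loose illustration of that gating idea, here is a conceptual Python sketch; the stage names follow the description above, but the completion frames are invented and this is not TP node logic or real project data:

```python
# Conceptual sketch only: each repair stage must finish before the next one starts.
# Stage names follow the description above; the completion frames are invented.
stages = [
    ("vWF binds to collagen and unravels", 40),
    ("platelet receptors bind to the vWF", 90),
    ("platelets activate and recruit more cells", 150),
    ("fibrin is deposited and stabilises the repair", 220),
]

def active_stage(frame: int) -> str:
    """Return the stage that should be animating at a given frame."""
    for name, completion_frame in stages:
        if frame <= completion_frame:
            return name
    return "repair complete"

for frame in (10, 60, 120, 300):
    print(frame, "->", active_stage(frame))
```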

Sometimes we can get medical diagrams, but very rarely is there an animation we can reference. Our goal is always to make the part of the process we are interested in as clear as possible - at the end of the day they are educational visuals to accompany a medical script.

Cebas: Thank you so much, Callum, for this immensely thorough explanation! What do you find usually are the more challenging aspects of medical FX? Would you say creativity is involved, or are things absolutely fixed as to how they need to be presented?

Callum: Funnily enough, the most rewarding and challenging things seem to go hand in hand. I already mentioned the limited CG resources available to us, but reference material is also always a struggle. Depending on the scale of the scene we might have SEM imagery, or blurry microscopic footage, or just some diagrams and descriptions. This does very little to convey the material quality or fine motion and as such gives us a lot of room for artistic interpretation. We are always taking inspiration from other sources; I like trying to add a fluid quality to cells, but a springy quality to protein structures. We use a lot of fruit/underwater photography on the look dev side and I’m constantly taking inspiration from the latest abstract particle videos or game concept art.

Even when we have work by people like Drew Berry, who portrays some of the more accurate interpretations of protein/molecular movement, we will always have to take artistic licence for the sake of clarity. Some clients want more stylistic interpretations, matching medical diagrams; some want to simplify the animation of a shot so we can focus on only the relevant parts of the narrative. Scale might also be loosely interpreted so we can visually see where, for example, an antibody is travelling to and what it is binding to.

Cebas: Could you and Leo say more about the Max pipeline, and the major software that you have to integrate to smooth out the daily work? For example, you use TP with Houdini and Frost, and even model with ZBrush (more on this later on…).

Callum: All the TP work I do is meshed using Frost. I find it a lot more flexible in terms of managing high-res geometry, especially with the recent inclusion of VRayProxy instances. I can export all the channels I need from TP using DataChannel, and can affect things like vertex color, material ID, shape ID and animation timing. I think it also helps the artists who are unfamiliar with TP to adjust materials and such on a Frost object.

Image: Hand-animated bacteria shape instance retimed over TP particles

My hobby away from work is sculpting characters in ZBrush and I really enjoy its approach for “modelling” organic things. You get a much better result, much faster than trying to model it in polygons. As a result, I have brought this into my workflow. Usually when creating the geometry for an environment I will either model it procedurally or sculpt it. In some cases, I will use TP to start a procedural shape and then sculpt some extra details into it. I try and avoid polygon modelling wherever I can.

Image: Geometry simulated in TP, imported into ZBrush before being dynameshed and sculpted

We have also been using Houdini. I feel like I’ve adapted its attribute-style workflow into TP using the Memory node. It is always fun to learn new software and new ways of thinking, and to bring those concepts back into your base software. The Alembic import gives us a lot of options for combining the two packages, and I hope to do more blending of the two in the future.

Cebas: Leo, since your office has thinkingParticles, presumably you are also on the 3dsMax pipeline - is this your main production workflow as well? Do you need to integrate a sizeable number of different software packages?

Leo: 3dsmax is our base package; all our work is rendered through it. We aim to stay as much as possible within 3dsmax, to help keep the pipeline light and projects agile, but we do branch out when required, also utilising ZBrush, Substance and Houdini to support 3dsmax. This also gives our artists more options when tackling difficult shots.

Cebas: What are the most challenging issues in production, Leo, for a medical studio like Random42?

Leo: Recruitment can be fairly tricky, as there are few people who have medical work in their showreels. It can be a challenge to assess whether someone is suitable for our field of work, so we look more at their composition, artistic eye and technical problem solving, assessing base skills as opposed to a particular shot they may have worked on. We can also get a wide variety of work and some complex processes to depict, so making sure we can easily adapt and produce the best work possible in a short space of time is vital.

Cebas: And Leo, since you do recruitment, how different is the training in terms of production versus pure technical art?

Leo: When we recruit new artists, we know they are artistically capable, however, it can be quite a challenge for new hires to work with organic, macro environments. A lot of people are not too familiar with SEM imagery or protein structures, or even where a lot of these environments exist within the body. We find that the skills artists use to model and light normal environments do not always translate very well, so a core part of our training is simply re-introducing the basics with an eye for an abstract environment.

We have animated numerous biological processes over the years and an important aspect of our training is all about sharing our previous knowledge and useful techniques. For example, we have tackled many blood vessel shots over the years and have built up lots of tips and tricks for creating the imagery. However, we never want to shoehorn an artist into a particular way of doing something; instead we like to utilise their creativity to see if they have a different idea on how to tackle something. I wouldn’t say we have anything ‘perfected’, as we are always finding ways to improve and develop our animations, creating richer environments and constantly evolving our overall style.

Cebas: I am curious - do the colours seen in every part of a medical animation resemble the actual physical colours of the internal structures and molecules, or are the colours mainly for visual reference?

Callum: With the scale we are dealing with, there is no visible light that can be recorded. SEM imagery uses electrons to “take the photo”, so whilst we can get very interesting surface detail, it is not in colour (images may be colourised for visualisation purposes only). Some of the protein databases can colourise proteins based on data, such as atom types or chains/structures, but this again is only data visualisation. It can still be fed into artwork as inspiration, however.

A lot of our colour palettes are purely for visual aesthetics. Sometimes we have brand colours to pick out; sometimes we need a lot of colours to separate the various proteins from each other. I know it is common to depict macrophages as purple, which I believe comes from a dyeing process used to differentiate them from other cell types.

Image: Example of colouring based on brand colours and for visual clarity

Callum: Here are some sample projects with TP. Below is a selection of some of the animations I have set up with TP, along with a quick breakdown of how TP was used.

Progression of bacteria:

Image: Still of bacteria dividing

 
NEW: Click through to the Callum Welsh Blood Repair & Clotting Tutorial with thinkingParticles
 

Cebas: Now, that is an amazing visual animation, and so different from what we normally see in movie effects! Could you elaborate a bit more on how you created it?

Callum: Of course. This TP setup went through a lot of versions over the years as I tried out different approaches. It started out as cells dividing in a void, with Bullet shapes spawning on top of each other and Bullet ejecting them out to get the dividing effect. But when we needed to modify it for bacteria dividing, it had to be changed into a much more controlled effect.

This scene really showed off how powerful Thinking Particles could be for our animations, with its looping behaviour and physics interaction. It allowed us to have much larger scale shots, where before we would have a close shot that only focused on a few hand animated bacteria dividing.

This is also a great example of how TP can communicate with Frost using DataChannels. When the bacteria divided, they also exported a “GeomTime” channel, which is used in Frost to play back a hand-animated bacterium on top of the particle animation; in this case it is used for the bulging in the centre of the bacterium as it first divides.
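
As a rough sketch of what that retiming amounts to (conceptual Python only; the real setup exports the channel from TP via DataChannel and plays it back through Frost, and the clip length here is invented), each particle simply plays the hand-animated clip from its own division frame:

```python
# Conceptual sketch only: per-particle playback of a hand-animated division clip.
# Each particle stores the frame on which it divided; its "geometry time" is then
# its own local frame into the clip, clamped to the clip length (values invented).
def geom_time(current_frame: int, division_frame: int, clip_length: int = 24) -> int:
    """Local frame of the division clip for a particle that divided on division_frame."""
    local_frame = current_frame - division_frame
    return max(0, min(local_frame, clip_length))

# Two bacteria that divided on different frames sample different points of the same clip.
for division_frame in (100, 112):
    print(division_frame, "->", geom_time(current_frame=115, division_frame=division_frame))
```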

 

Cells signalling:

 

cellsignalling

Still of cells signalling

Callum: This is our take on a plexus system, where instead of lines, particles travel between each point (or cell). When a cell receives the signal, it activates and starts signalling itself. I went back and added PLight later on, making it so the scene is only illuminated by the signalling particles, which adds a great mood to the shot.
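
As a loose illustration of that signalling behaviour (a conceptual Python sketch; the cell connectivity is made up and this is not how the TP setup is wired), the spread can be thought of as each newly activated cell signalling its neighbours in turn:

```python
# Conceptual sketch only: signal spreading through a small, made-up network of cells.
# A cell that receives the signal activates and then signals its own neighbours.
from collections import deque

neighbours = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}

def activation_order(start: int) -> list:
    """Return the cells in the order they become active, starting from one signalling cell."""
    active = {start}
    queue = deque([start])
    order = [start]
    while queue:
        cell = queue.popleft()
        for nxt in neighbours[cell]:
            if nxt not in active:       # only inactive cells respond to the signal
                active.add(nxt)
                order.append(nxt)
                queue.append(nxt)
    return order

print(activation_order(0))  # [0, 1, 2, 3, 4]
```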

Some further R&D I did with PLight (I believe it is a new feature in TP 6.5) looked at illuminating a VDB volume rendered through VRayVolumeGrid (and originally simulated in Houdini). I love the potential of PLight so far; it adds so much dynamism to a shot and I look forward to exploring it more in future projects.

 

Image: PLight and VRayVolumeGrid R&D


Cell Anatomy:

 

Image: Still of the final cell image, using TP to create the endoplasmic reticulum and Golgi complex

Callum: This is a great example of when I use TP for modelling. Here I needed to create an endoplasmic reticulum (part of the interior of a cell). Breaking the shape down, I noticed it appeared to be formed of loops locking together, so I created a setup that creates loops between particles.

It is achieved with a simple PPassAB setup, where nearby particles are referenced together and spawn a 'child' that loops between them using the Orbit node. I use the Distance condition and some simple math to calculate the pivot and distance of the orbit. I can adjust the profile of the loop by mapping the time of the orbit onto its distance via a ValueToValue node.
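
One reasonable reading of that "simple math" (a conceptual Python sketch only, not the actual node network): the orbit pivot sits at the midpoint of the two referenced particles, the orbit distance is half their separation, and the loop profile comes from how the orbit time is remapped onto that distance:

```python
# Conceptual sketch only: the geometry behind looping a 'child' between two parents.
# The real setup uses PPassAB, Orbit, Distance and ValueToValue nodes; this is plain Python.
import math

def loop_between(a, b, t):
    """Point on a semicircular loop from particle a to particle b, for t in 0..1."""
    pivot = [(a[i] + b[i]) / 2 for i in range(3)]  # orbit pivot: midpoint of the pair
    radius = math.dist(a, b) / 2                   # orbit distance: half the separation
    angle = math.pi * (1.0 - t)                    # sweep from a (t=0) to b (t=1)
    # Remapping radius over t (as a ValueToValue node would) changes the loop's profile.
    return [pivot[0] + radius * math.cos(angle),
            pivot[1] + radius * math.sin(angle),
            pivot[2]]

a, b = (0.0, 0.0, 0.0), (2.0, 0.0, 0.0)
for t in (0.0, 0.5, 1.0):
    print(t, loop_between(a, b, t))
```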

Myelin Sheath Damage:

Image: Still of the myelin damage shot; everything in the shot was created with the help of TP

Callum: This scene was an equal split of using TP for modelling and animation. The cells were created by first simulating some debris over a sphere in Bullet and taking that into ZBrush for a sculpting pass. The neurons were grown procedurally using the spline tools, with the advantage of being able to use the spline head (the particle spawning the spline) for the signalling, utilising VRayDistanceTex. The myelin sheath damage was simulated using BTSoftBody and its Autobreak parameter. This was then cached out using XMesh and positioned into place, with the animation offset. Finally, a layer of particles was created for the cytokines being emitted from the cells and damaging the myelin.

Cebas: Please let us know about some of your future projects.

Callum: We have got a lot of exciting stuff upcoming! Recently I have been working with our interactive department on AR holograms for our showreel app. We were able to effortlessly export some TP scenes into an FBX file and import it into Unity to run on a mobile device. The new Unity update now supports FBX visibility, which is how TP’s Export node handles particles before spawning or after dying.

Long term for the company, we would like to look into producing educational content for schools and universities. I feel that CG animations can really help in teaching these processes compared to diagrams and walls of text. I am a very visual person and find it a lot easier to understand something through an animation.

We are looking at expanding as a company over the next few years too. When I started just under six years ago we were around 12 people, but we are now in excess of 50. The future is looking very bright for us at the moment.


More Random42 Exclusive Medical 3D Models, Animations and Visual Effects @ Cebas Official Vimeopro

 

Cebas: Callum, Leo, thank you so much for sharing with us today. We are very pleased to hear of the many medical and scientific projects needing visual effects! And we hope that Thinking Particles, and our ongoing updates creating ever more powerful operators and helpers for visualization and animation, will continue to help your team represent the workings of microcellular activities in ever greater detail.