I find people problematic.
Not necessarily in real life, but as a scientific illustrator I’m most comfortable showing the functioning of a cell or interactions of subatomic particles. Creating a realistic human is a different matter.
When given the opportunity to illustrate the exosuit—a wearable device that assists mobility—featured on the 16 August cover of Science, I knew the biggest challenge for me would be creating a realistic human figure in motion. I don’t usually illustrate people from the outside. Most of what I illustrate happens inside the body, and usually in a specific place. The extent to which a whole human shows up is often just as a “locator”: an outline or silhouette indicating that the work was done in a human, or happening in a certain body part. As a result, I did not have a method for creating a 3D model of a human, let alone one worthy of being viewed at cover size.
The exosuit’s function guided a lot of my decisions for composing the illustration. The exosuit reported by Jinsoo Kim et al. assists the wearer by having a cable tug at the back of each leg at the moment of hip extension (leg behind the torso) during either walking or running. To capture my 3D person experiencing this cable tug, I needed a good pairing of viewing angle and moment in the run cycle. I needed the hip closest to the viewer to be in extension, but the back leg couldn’t be too obscured.
I went down a few avenues for getting my hands on a good-quality model of a human figure that I could pose. My preference was to avoid spending time sculpting the figure myself. Browsing premade models online, I found plenty with unrealistic or overly idealized proportions, many wearing costumes. I wanted control over the shape of the person—an everyman, not a superman. This led me to a couple of programs that let you generate, pose, and export 3D models of people. Both had limitations for my project, but one of them, once I weeded through all the ridiculous sci-fi and fantasy options, gave me customization of the figure plus control over individual joints.
But my next challenge loomed: How to pose the figure in an accurate running position? I could kind of, sort of pose it manually, but I was also very interested in the moments when the gait changes between walking and running. (A key feature of the suit is its ability to accommodate both.) I wasn’t sure if there would be anything visually compelling about that motion, but I knew it wasn’t something I could make up. At this point, I felt that what I actually needed was an animated model.
The solution for this came from a tutorial that showed how to create a custom figure and animate it using motion capture data with Adobe Fuse and Mixamo. Of course! Motion capture would be the perfect way to get my figure in a realistic run; the motions look highly realistic because they are indeed from a real human.
Motion capture, or “mocap,” can be thought of as the digital analog to rotoscoping. A human actor’s movement is recorded using special cameras. A program transforms the data from these cameras into the virtual movements of joints, which can then be applied to a virtual figure. One could make a bunny do the macarena. Or a menacing alien slip on a banana peel. Mixamo had a few nice animations for walking and running that looked very natural on the 3D figure I built in Fuse. I could now just scrub through the animation, identify the timing when the exosuit would be active, play with the camera angle, and find that perfect view I was after.
Having spent a good amount of time with the supplemental movies from Jinsoo’s Harvard University lab, I recalled that their setup actually included motion capture cameras. I wondered if I could combine the above method with data from the very lab that did the work I was illustrating.
I was excited when Jinsoo came through on my request for some mocap data directly from their lab. He supplied me with a selection of running and walking at different speeds (such as the one below), along with transitions from walking to running and vice versa. To my dismay, however, Mixamo does not let you import your own mocap data. This was the point where a cloud of technical difficulties started to outpace my imminent deadlines.
Ultimately, I was not able to perfectly connect Jinsoo’s mocap data to my model. I entered a cycle of hope and frustration as I continually tried things that seemed promising, only to have them fail. At one point in this process I finally got the answer to my earlier question of whether the transition between walking and running would look visually interesting: it was … negative. It was becoming painfully clear that I simply could not figure out what was wrong and fix it in the time I had remaining. I think every artist encounters moments like these, and the anxiety they induce is frankly terrifying. It paralyzes you from finding creative alternatives that might be hiding in plain sight. I ended up using the running mocap from Mixamo to create the cover image, because that is what worked most smoothly. Instead of showing any specific period of the walk-run gait cycle, we decided to go with a hint of general motion. I modeled the exosuit on the figure, referencing photos from Jinsoo. Jinsoo also provided me with a 3D model of the actuator, the chunk of machinery seen on the person’s back.
We also needed a variation of the cover image for the illustration leading the Perspective article covering this paper. For this, I pulled from some of my trials for the cover and used Jinsoo’s animation of the person going from walking to running to produce the image. The biggest problem was that the shoulders and arms did not look natural, so we simply cropped the image at the person’s waist, and I manually sculpted a more natural closed fist in the foreground. It’s not perfect, but it was a suitable salvage of all my work!
While the finished illustrations are solid pieces we can stand behind, this wouldn’t have happened without the support of Jinsoo and his team, and of my colleagues on the Visuals team. I had higher ambitions for this project than I believe the finished work reflects. However, doing something creative means sometimes being hit with the realization that your ideas were too ambitious for your time and skills. Or maybe you expected the data to look different from how it turned out. Feeling helpless and discouraged is an inevitable symptom of venturing into new territory. Luckily, my teammates realize this, and with their advice we ended up with results we could feel good about. Thanks to this project, I have a new method in my toolbox to keep putting into practice and improving upon. And I feel a little better about people.
Valerie Altounian is a Senior Scientific Illustrator at Science.