PCMag editors select and review products independently. If you buy through affiliate links, we may earn commissions, which help support our testing.
What can 5G do for you? How about handling enough data to get inked by a tattoo artist in a remote location? London-based technologist Noel Drew and tattoo artist Wes Thomas made it a reality.
A robot tattoo artist probably isn’t the first thing that springs to mind when you think about 5G use cases. But as part of a 5G marketing campaign intended to demonstrate the capacity of 5G to handle loads of data with minimal latency, T-Mobile Netherlands teamed up with London-based technologist Noel Drew and tattoo artist Wes Thomas to make it a reality.
The project, dubbed The Impossible Tattoo, tasked Drew with building a 5G-powered, robotic machine learning system that allowed Thomas to ink up clients from afar. Dutch actor and TV personality Stijn Fransen signed up to let the robot tattoo her arm.
We talked to Drew about how the system was built, the challenges of remote tattooing, and whether he let Thomas test out his skills on him.
Let’s kick off with the backstory to this robot/co-bot tattoo concept.
[ND] The original brief, from [creative agency] Anomaly Amsterdam, was to demonstrate T-Mobile’s 5G through remote tattooing. I have worked with robotics in one way or another on a number of previous projects, but I had never considered combining it with the art of tattooing. Brands often want to be the first to do something or use something cutting-edge. Having said that, the 5G-Tattoo may have been a “world first,” but it was an authentic use of cutting-edge technology, combined with an ancient art, to tell a very human story.
Did you have the kit lying around?
[ND] Ha! This definitely wasn’t stuff we had on hand! Every part of the build was considered and either purchased specifically or designed, developed, and fabricated in house from the ground up. I needed the robotics aspect of it to be as close a representation of the artist’s hand (i.e. a jointed limb) as possible. In the end we opted for a Universal Robots UR3e as a base and developed the end effector and mounts in-house. The UR platform was really intuitive and flexible, and we had it up and running and calibrated in no time. The challenge was getting it to run in real time in perfect sync with the artist.
Noel Drew sets up the robot. (Courtesy of Noel Drew)
Talk us through the development process.
[ND] The bulk of development took place over six weeks or so at the studio/workshop in London. Those were some of the most exciting days I’ve spent on a project. Multiple tracks of development across software and hardware, constant iteration of designs and prototyping, a bank of 3D printers working nearly 24/7 churning out parts. Each day was a small step (sometimes backwards) and when we weren’t developing we were testing and then more testing and then even more testing. Many butternut squash were harmed in the test cycle before it was refined and ready for reality.
How did you approach calibrating the robotic arm?
[ND] Initial research got me a fair way in understanding the mechanical aspects of it all but working with [the tattoo artist] Wes was fascinating and terrifying at the same time. There was so much more to consider, ranging from small things that just needed a simple solution to fundamental aspects of tattooing that posed much bigger problems. For example, my assumption was that tattoo machines—not guns, big faux pas apparently—held a reservoir of ink much like a modern fountain pen. Finding out they are dipped more like an old feather quill meant we had to develop some sort of mechanically operated ink-to-needle delivery system for the robot, as dipping the needle each time would have been a nightmare. The way tattoo artists stretch the skin in different directions before applying the needle depending on what part of the design they are working on was another challenge.
Wes Thomas gets to work. (Courtesy of Noel Drew)
And the final shoot?
[ND] That took place in various locations in Amsterdam. This was actually the first time we met with the agency face to face. The entire project took place during lockdown, so travel was kept to an absolute minimum. The exception to this was Wes the tattoo artist. We had lots of conversations over video calls, but it was essential that he made it over to London during development to work with us for a few days.
When you first met Wes, was he up for the challenge or initially skeptical?
[ND] A mixture of curiosity and skepticism, I think. He was obviously coming into the project with a deep understanding of the intricate techniques and processes required to tattoo someone. He was also throwing himself into a world of technology that arguably he was as unfamiliar with as I was with tattooing. There was definitely mutual respect. I made it clear early on that if he wasn’t happy then I wasn’t happy, and at the same time he remained open to working through and trying out solutions to problems even if initially he felt it was impossible.
How much detail did Dutch actor and TV personality Stijn Fransen want to know before she volunteered to get inked?
[ND] Stijn was quite simply amazing. She was fully on board right from the start. The start-up and calibration of the robot just prior to the actual tattooing was quite involved, so I felt it was important for Stijn to see the entire process and understand in detail what was going to happen when she was in the hot seat. She was so unbelievably calm about the whole thing.
Drew tests out the robot’s skills on an unsuspecting butternut squash. (Courtesy of Noel Drew)
How did you (technically) track the tattoo artist’s movements and detect when Wes was making contact with the surface of a fake practice arm and transmit this data over the network?
[ND] The first challenge was mapping the geometries. To do this we used the Azure Kinect DK [Development Kit], mounting one sensor to a frame above the tattoo artist and the other on the robot arm’s end effector; first taking a filtering scan of the work area without any arm in place, then a second scan with the arm. By subtracting one from the other we were left with a point cloud representing the upper surface of the fake and human arms (or vegetable of choice during testing). The next step was to convert these point clouds into the surface geometry needed for collision detection. The algorithms we developed were tuned to recognize cylindrical forms, which allowed us to generate the smooth surface and also detect the general orientation in the XY plane, seeing as the XYZ spaces of the two systems at each end weren’t aligned.
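The scan-subtraction and cylinder-orientation idea Drew describes can be sketched in a few lines. This is a minimal illustration only, not the project’s actual C++ code: the function names, the tolerance, and the brute-force point matching are assumptions made for clarity (a real system would use a KD-tree and a proper cylinder fit).

```python
import numpy as np

def isolate_arm(background, scene, tol=0.01):
    """Return the scene points not present in the background scan.

    Naive subtraction: a scene point survives if it is farther than
    `tol` (metres) from every point in the background scan.
    """
    # Brute-force pairwise distances; fine for a sketch, slow at scale.
    d = np.linalg.norm(scene[:, None, :] - background[None, :, :], axis=2)
    return scene[d.min(axis=1) > tol]

def cylinder_axis(points):
    """Estimate the arm's long axis via PCA: the direction of greatest
    variance in the isolated point cloud approximates the cylinder axis."""
    centred = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return vt[0]  # unit vector along the dominant direction
```

The recovered axis gives the general orientation of the arm in each system’s own frame, which is what lets the two unaligned coordinate spaces be related at all.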
Many tattoo artists use intuition to feel the relationship between skin surface and ink.
[ND] Right. Working with Wes showed us that the tattoo artist has a deep understanding of human skin, which changes hugely depending on the location on the body and also from person to person, so it was vital we didn’t hinder or interfere with this in any way. I also want to point out that I wasn’t trying to replace traditional tattooing, and the human aspect of tattooing, with this robot-led concept. I’ve been careful not to be seen as trivializing the art form especially after getting such an understanding of it.
Well said. So how did you do the tracking itself with the robot arm?
[ND] We looked at optical tracking approaches but settled on a 3D stylus, the Touch X from 3D Systems, and developed a bracket that allowed us to mount the tattoo machine onto the stylus, which actually ended up perpendicular to the direction of the needle. As long as we knew where the stylus thought its tip was, and we knew the vector offset and orientation to the tip of the needle, we were able to track the machine. Once we had the position of the needle locked, we took the collision points, unwrapped the geometry flat, and sent the new 2D positions over 5G to the robot, which then re-wrapped the points onto the new 3D geometry of Stijn’s arm.
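The flatten-and-rewrap step can be sketched for the idealized case of a cylinder with its axis along x. This is a hypothetical illustration of the concept rather than the production code, which worked on the scanned surface geometry rather than a perfect cylinder.

```python
import numpy as np

def unwrap(point, radius):
    """Flatten a 3D point on a cylinder (axis along x) to 2D:
    (axial position, arc length around the circumference)."""
    x, y, z = point
    angle = np.arctan2(z, y)
    return np.array([x, angle * radius])

def rewrap(flat, radius):
    """Map a flat (axial, arc-length) coordinate back onto a
    cylinder of the given radius, again with its axis along x."""
    x, s = flat
    angle = s / radius
    return np.array([x, radius * np.cos(angle), radius * np.sin(angle)])
```

Because the artist-side and client-side scans were captured in independent coordinate spaces, transmitting the unwrapped 2D positions and re-wrapping them onto the client’s geometry sidesteps any need for the two 3D frames to line up.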
Side question: what language were you using to achieve this?
[ND] This was all done in home-baked software written in C++, which also handled the scanning from the Kinect sensor and the offsets introduced—all developed by the superbly talented Seph Li, a media artist and creative coder based here in London.
So how did you avert nasty mishaps?
[ND] As you might expect, there were fail-safes all over this project, from software triggers to manual emergency stop buttons: one in my hand and one in Stijn’s. Safety was such a huge consideration that there was a lot of human sense-checking involved. Technically, we also introduced a highly accurate industrial linear potentiometer to detect the surface of Stijn’s arm and maintain the desired depth of the needle. This served as a fail-safe in the event the needle tried to go too deep, just one of many safety features built into the system.
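In essence, a depth fail-safe like the one Drew describes reduces to a continuous comparison between the sensed skin surface and the commanded needle tip. A minimal sketch, with the threshold value and measurement convention invented purely for illustration:

```python
def needle_too_deep(surface_mm, needle_tip_mm, max_depth_mm=2.0):
    """Return True if the needle tip sits more than max_depth_mm past
    the sensed skin surface (positions measured along the needle axis,
    larger values = deeper). True would trigger an emergency stop."""
    return (needle_tip_mm - surface_mm) > max_depth_mm
```

In a real control loop this check would run at the sensor’s sample rate, alongside the software triggers and the manual e-stop buttons.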
We have to ask. Did you get inked by Wes+robot (or just Wes) as a memento?
[ND] I’d never set foot in a tattoo shop before this project, but as with all my work, I felt committed to understanding every aspect of what I was trying to do. So yes, I do have a permanent souvenir of the project, courtesy of Wes. It’s the symbol for an incandescent light bulb, an appropriate crossover between technology and creativity, even if some people do think it’s a Pokémon reference.
Have you always been into geek stuff?
[ND] Yes. Our first family computer was the Amstrad CPC 6128. I remember as a kid building dens with my mates then programming little mission control “apps” in BASIC that would drive the story lines and let us know when we were under attack from aliens. I still love that machine and refuse to let it be thrown out. I remember when I first discovered the Arduino platform whilst working as a senior developer and thinking “Wait! You mean I can click this button and make things happen? In the real world?!” That was it: I was down the electronics rabbit hole and haven’t looked back. Now so much of what I do is about physical sensory interaction in some form or another.
Finally, you’re something of a nomad, traveling to Morocco, New York, and up Kilimanjaro in recent years. As soon as we’re allowed to get out there again, where are you heading first, and why?
[ND] If you had asked me what my plans were back at the start of 2020, I would have said a year of training hikes in and around Europe leading up to starting the Pacific Crest Trail in April 2021, but that all went south somewhat. I also have plans to go back to Iceland. It is such an amazing country, and I’ve barely explored it at all. There is something about the moon-like interior and its contrasting glaciers and volcanoes that makes it such a fascinating place. Realistically? The first place I will go is Scotland. The mountains of the West Highlands are one of my favorite places in the world. No light pollution, no motorways and—if you’re lucky—no phone signal.