A team of scientists from MIT and Facebook has made headway on an object tagging system that’s invisible to the naked eye. They plan to present their model, called InfraredTags, at the 2022 ACM CHI Conference on Human Factors in Computing Systems in April.
The name is a giveaway for how the tech works. InfraredTags uses infrared light-based barcodes and QR codes that are embedded permanently into the bodies of 3D printed objects. A corresponding phone attachment with infrared cameras can be used to detect the code.
There are many applications for object tagging. It’s widely used in packaging, logistics tracking, robotics, shopping, and, more recently, in augmented and virtual reality. Think of a grocery store checkout, where cashiers scan the barcode on a box of cereal to get its price, or of using your phone at a restaurant to scan a QR code for a menu.
Invisible tags that are actually part of the object are appealing because they are unobtrusive and durable—they can’t be wiped off or scratched. Some groups have already been exploring their own versions of unseen barcodes. For example, Columbia University’s AirCode embeds air pockets inside 3D printed objects to form its tags, Microsoft’s InfraStructs reads similar internal structures with a special camera, and Columbia’s LayerCode has tested infrared dye codes with small 3D printed resin sculptures.
The MIT-led team wanted to create a tag that builds on these previous ideas but is relatively inexpensive, simple, and quick. They noted in their paper that both AirCode and InfraStructs took minutes to register and process the images they captured.
They created the tags, which look like regular barcodes or square QR codes, using an infrared-transmitting filament interspersed with air gaps. The filament appears opaque and unremarkable under visible light but translucent under infrared light. The tags are printed inside the walls of the 3D object, and multiple tags can be placed across a single object.
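Conceptually, embedding a code this way amounts to mapping each module of a 2D barcode to either an air gap or solid filament inside the print. The sketch below illustrates that mapping only; the names and data structures are hypothetical and are not taken from the team's actual fabrication pipeline:

```python
# Hypothetical sketch: turning a 2D barcode bit matrix into per-region
# print instructions. "1" modules become air gaps and "0" modules stay
# solid infrared-transmitting filament, producing the contrast the IR
# camera later reads. This is an illustration, not the authors' code.

AIR, FILAMENT = "air_gap", "solid"

def code_to_print_layer(bit_matrix):
    """Map a 2D barcode bit matrix to a layer of print regions."""
    return [
        [AIR if bit else FILAMENT for bit in row]
        for row in bit_matrix
    ]

# A tiny 4x4 example pattern (a real QR code is at least 21x21 modules).
pattern = [
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 0, 1],
    [0, 0, 1, 0],
]

layer = code_to_print_layer(pattern)
print(layer[0])  # ['air_gap', 'solid', 'air_gap', 'air_gap']
```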
The accompanying phone attachment has a microprocessor with an imaging module, a detection algorithm, and a paired infrared camera that illuminates the object’s surface to reveal the hidden tag. It streams live video to a web-based application on the phone. After detecting a tag, the application displays the information it contains, such as a web address, along with the tag’s location.
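On the reading side, the core decoding step reduces to thresholding the brightness contrast the infrared camera sees back into code bits. A simplified sketch, assuming the tag has already been located and cropped into an aligned grayscale grid (the real system runs a fuller image-processing pipeline, and the threshold here is illustrative):

```python
# Hypothetical sketch of recovering a bit matrix from an IR image.
# Air gaps and solid filament transmit infrared light differently, so
# the embedded tag shows up as a brightness pattern. A real detector
# would first locate and rectify the tag; here we assume a cropped,
# aligned patch of grayscale pixel intensities (0-255).

def decode_ir_patch(pixels, threshold=128):
    """Threshold an aligned grayscale patch back into code bits."""
    return [
        [1 if px < threshold else 0 for px in row]  # dark module -> 1
        for row in pixels
    ]

# Simulated 2x3 patch: dark modules (~40) vs. bright modules (~200).
patch = [
    [40, 200, 35],
    [210, 45, 198],
]
print(decode_ir_patch(patch))  # [[1, 0, 1], [0, 1, 0]]
```

In practice a QR decoder would then run on the recovered bit matrix to extract the payload, such as a URL.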
In their testing, they used the tag technology to adjust a smart thermostat without directly touching it, and get web-based information for a real-life object like a mug. One day, the team imagines that this could enable augmented reality interactions with objects and devices.