December 15, 2023
A new type of 3D printer created by MIT researchers uses machine vision to monitor itself and create objects faster with a wider range of materials than traditional printers.
By doing so, engineers can use materials they could not use before, opening the door to more sophisticated and useful creations such as a robotic gripper shaped like a human hand and controlled by flexible and reinforced ‘tendons,’ according to the university.
With typical 3D inkjet printers, tiny droplets of resin are deposited through nozzles onto a surface, then smoothed with a scraper or roller and cured with UV light. But if the material cures slowly, the roller can squish or smear it, which limits the types of materials these printers can work with.
MIT’s new contactless 3D printing system, however, needs no mechanical components to smooth the resin used to form objects, allowing it to work with materials that cure more slowly than the acrylates traditionally used in 3D printing.
Materials that cure more slowly often exhibit superior characteristics, such as enhanced elasticity, greater durability and improved longevity.
“We directly fabricated a wide range of complex high-resolution composite systems and robots: tendon-driven hands, pneumatically actuated walking manipulators, pumps that mimic a heart, and metamaterial structures,” the scientists from MIT, MIT spinout Inkbit and ETH Zurich wrote in a paper describing their work.
The new 3D printer can also print 660 times faster than comparable 3D inkjet printers, according to the team.
The study builds upon the MultiFab, an affordable, multi-material 3D printer the researchers initially introduced in 2015. MultiFab, equipped with thousands of nozzles, could deposit minuscule resin droplets that were then UV-cured, enabling high-resolution printing with up to 10 different materials simultaneously.
In their most recent project, the team focused on creating a non-contact printing technique to expand the range of materials available for fabricating more complex devices. They invented a method called vision-controlled jetting, which integrates four high-frame-rate cameras and a pair of lasers to monitor the printing surface continuously, capturing images as the nozzles eject tiny resin droplets.
The system's computer vision transforms these images into a detailed depth map in less than a second. This map is compared against the CAD model of the item being produced, allowing for precise adjustments in resin output to align with the intended design. This automated process can fine-tune each of the printer's 16,000 nozzles, providing exceptional control over the minute details of the printed device.
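The feedback loop described above can be reduced to a simple idea: scan the surface, subtract the scan from the CAD target, and have each nozzle deposit more or less resin where the layer is under- or over-filled. The sketch below is a minimal, hypothetical illustration of that idea only; the function name, grid resolution and droplet parameters are invented for the example, and the actual vision-controlled jetting pipeline is far more sophisticated.

```python
import numpy as np

def correct_deposition(depth_map, target_map, droplet_height, max_droplets=4):
    """Per-nozzle correction: where the scanned print is low relative to
    the CAD target, schedule extra droplets; where it is high, withhold them.

    depth_map, target_map: 2D arrays (mm), one cell per nozzle position.
    droplet_height: nominal height added by one cured droplet (mm).
    """
    error = target_map - depth_map               # positive -> underfilled
    droplets = np.round(error / droplet_height)  # droplets needed per cell
    return np.clip(droplets, 0, max_droplets).astype(int)

# Toy example: a 3x3 patch where the centre nozzle has under-deposited.
target = np.full((3, 3), 1.0)   # CAD model says this layer should be 1.0 mm
scanned = target.copy()
scanned[1, 1] = 0.94            # machine-vision depth map reads low here
plan = correct_deposition(scanned, target, droplet_height=0.03)
# plan tells the centre nozzle to fire two extra droplets; all others hold
```

Because the correction is computed per cell, the same routine scales to any nozzle count; the real system applies it across 16,000 nozzles on every pass.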
The MIT project is part of a growing effort to use machine learning for 3D printing. AI-generated designs can immediately be realized in print, allowing for rapid prototyping and testing, Nat Trask, a professor of engineering at the University of Pennsylvania, said in an interview.
“In particular, for metamaterial design, folks print small, intricate, tiled patterns which combine to give a desirable bulk mechanical response,” Trask added. “While people have pursued this for a while, the patterns and geometry have been limited by the complexity that humans can reason through designs for. With generative models, the same tools as DALL-E that generate images of cats playing basketball on the moon, people can explore more intricate designs that can target different material responses.”
In the old way of designing things, a human would spend days building a computer model of a design and then running simulations based on solving large systems of equations, Trask said. In the last few years, machine learning (ML) has been able to replace these simulations with predictions 1,000 times faster.
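The surrogate-model idea Trask describes can be sketched in a few lines: run the expensive solver a handful of times, fit a cheap model to those results, then query the cheap model when exploring new designs. Everything below is illustrative, not any group's actual workflow; the "simulation" is a stand-in function, and real surrogates are typically neural networks trained on thousands of high-fidelity runs.

```python
import numpy as np

def expensive_simulation(strut_thickness):
    # Stand-in for solving a large system of equations; assume the bulk
    # stiffness response happens to be quadratic in strut thickness.
    return 2.5 * strut_thickness ** 2 + 0.1

# Sample a small set of expensive solver runs.
samples = np.linspace(0.1, 1.0, 20)
responses = np.array([expensive_simulation(t) for t in samples])

# Fit a cheap quadratic surrogate to the sampled runs.
coeffs = np.polyfit(samples, responses, deg=2)
surrogate = np.poly1d(coeffs)

# New design candidates are now evaluated by the surrogate, not the solver.
candidate = 0.55
predicted = surrogate(candidate)
```

Each surrogate query is a polynomial evaluation instead of a full solve, which is where the orders-of-magnitude speedup comes from when a design loop must test thousands of candidate geometries.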
“In the next few years, I'm expecting to see machine learning tools that can predict the behavior of a part ‘on the fly,’ allowing AI/ML to not just propose new print geometries but also have a feedback loop that explores designs using online physics models,” Trask said.
AI enables users to implement image-based process monitoring and closed-loop control more effectively, Ben Schrauwen, senior vice president and general manager of Oqton, a 3D printing company, noted in an interview. He said that the discovery and development of new polymers and alloys can be accelerated through the use of AI models that understand molecular and atomic structures and interactions.
“For example, you could have ChatGPT-like interfaces to interact with and ideate around large corpora of research literature,” he added. “AI is also being used to automatically recognize and segment 3D images of human anatomy and to automate the design of dental parts. Following this idea, AI can already automatically identify 3D models and suggest the optimal segmentation, how to place supports, orientation and nest parts for 3D printing.”
Oqton has created AI-based software that enables dental labs to automatically prepare files for 3D printing instead of relying on manual prep by operators and technicians. Automated file preparation can have a powerful impact on the overall manufacturing workflow.
“We have seen dental labs eliminate hours of manual work per day with AI-based automated support generation,” Schrauwen said. “The knock-on effect of this is that labs are printing more parts with the same number of machines. As more and more organizations and industries see that 3D printing is so cost-effective and fast, we can expect to see more of them use the technology.”
“The overall impact of AI on additive manufacturing (AM) workflows is that the process is more predictable, and technicians can send a job to be printed overnight and, in the morning, have the certainty that it will be ready for the next steps,” he continued.
AI is also playing a key role in improving quality control procedures, Schrauwen said. One example is the capability to oversee all 3D printing tasks from a single, unified viewpoint.
“This makes it easy to know the status of all jobs in the process, track changes in status, see video feeds of the build platform as jobs progress, and keep tabs on live sensor values,” he said. “Unlike a traditional Manufacturing Execution System (MES), a next-generation solution powered by AI can source information from various machines and applications in a unified experience that can detect problems and suggest solutions.”
Sascha Brodsky is a freelance technology writer based in New York City. His work has been published in The Atlantic, The Guardian, The Los Angeles Times, Reuters, and many other outlets. He graduated from Columbia University's Graduate School of Journalism and its School of International and Public Affairs.