Technology company Roboflow has announced that Meta's Segment Anything Model (SAM) has saved its community an estimated 74 years of manual image processing time, transforming a previously tedious object segmentation workflow.
The AI technology has been used to process over 60 million polygons across Roboflow's platform, finding applications in disaster recovery efforts, live sports instant replays, and insurance claims processing using aerial imagery.
Meta's release of SAM 2 in July 2024 expanded the technology's capabilities to enable real-time, promptable object segmentation in both images and videos.
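To make "promptable" segmentation concrete: the user supplies a prompt such as a clicked point, and the model returns a mask for the object under that point. The toy sketch below illustrates the prompt-to-mask idea only, using a simple flood fill on a binary grid as a stand-in for SAM's learned model; the function name and grid are illustrative, not part of any SAM API.

```python
from collections import deque

def point_prompt_mask(image, seed, threshold=1):
    """Toy 'promptable segmentation': return a mask of the foreground
    pixels connected to the clicked seed point.
    (Illustrative flood fill only -- SAM uses a learned model.)"""
    rows, cols = len(image), len(image[0])
    mask = [[0] * cols for _ in range(rows)]
    r0, c0 = seed
    if image[r0][c0] < threshold:
        return mask  # prompt landed on background: empty mask
    queue = deque([seed])
    mask[r0][c0] = 1
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and image[nr][nc] >= threshold and not mask[nr][nc]):
                mask[nr][nc] = 1
                queue.append((nr, nc))
    return mask

# A 5x5 "image" with two separate objects; clicking the top-left
# object yields a mask covering only that object.
image = [
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 0],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
]
mask = point_prompt_mask(image, (0, 0))
```

Here the click at `(0, 0)` selects only the four connected top-left pixels, leaving the other object unmasked, which is the essence of prompt-driven segmentation.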
With over 500,000 publicly available datasets comprising approximately 350 million user-labeled images on Roboflow Universe, the technology has found diverse applications. "Our customers produce electric vehicles, move critical goods across the US, produce ice cream—and many of them use SAM in manufacturing and logistics processes, ensuring their products meet their high quality standards before reaching consumers," said Roboflow CEO Joseph Nelson.
The open-source nature of SAM has enabled continuous improvement through community feedback. Collaboration between the SAM team, researchers, and users across the AI and broader tech communities has driven significant improvements from SAM 1 to SAM 2.
Real-world applications extend beyond industry. At the Exploratorium museum in San Francisco, visitors use SAM to examine microscopic organisms, while researchers employ it to monitor fish populations and assess coral reef restoration efforts.