Meta
"Meta's new robotic fingertips can feel the lightest touch. Their testing system puts robots through everyday tasks in virtual homes with thousands of items."
"Meta partners with Reuters while NYT litigates, showing data access divide: only AI companies with 'deep pockets' can afford premium content deals."
"Quantised Llama 3.2 achieves 56% size reduction using QLoRA and SpinQuant, with 4-bit weights and 8-bit activations for mobile deployment."
"SAM 2 enables real-time object segmentation, processing 60M+ polygons across 350M images for manufacturing QA and scientific research."
Meta's Llama model uses split inference: the first layer runs on the user's device and the remaining 31 run in the cloud, preserving privacy without requiring the quantisation that full on-device inference would need.
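The split-inference idea can be sketched with a toy 32-layer model. This is a conceptual illustration under stated assumptions (dense matrix layers standing in for transformer blocks), not Meta's actual protocol: the device computes only the first layer and ships the intermediate activation, so raw input never leaves the device.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy 32-layer "model": each layer is a matrix multiply (stand-in for a transformer block).
layers = [rng.standard_normal((8, 8)) / 8 for _ in range(32)]

def device_forward(x: np.ndarray) -> np.ndarray:
    """Runs only the first layer locally; only this activation is sent to the server."""
    return x @ layers[0]

def cloud_forward(h: np.ndarray) -> np.ndarray:
    """Runs the remaining 31 layers server-side, never seeing the raw input."""
    for w in layers[1:]:
        h = h @ w
    return h

x = rng.standard_normal(8)
split_out = cloud_forward(device_forward(x))

# Sanity check: the split pipeline matches a full forward pass.
full = x
for w in layers:
    full = full @ w
assert np.allclose(split_out, full)
```

The design trade-off: the cloud keeps full-precision weights for the bulk of the network, so the device needs neither the whole model nor a heavily quantised copy of it.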
Meta FAIR releases SAM 2.1 for improved image and video segmentation, Spirit LM for multimodal speech/text, Layer Skip for faster LLM inference, and tools for cryptography.
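Layer Skip speeds up inference by letting the model exit early from intermediate layers when it is already confident. The sketch below is a simplified illustration of the early-exit idea with made-up layer sizes and a shared output head; it is not the released implementation (which also pairs early exits with self-speculative decoding).

```python
import numpy as np

def softmax(z: np.ndarray) -> np.ndarray:
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(1)
# Toy 12-layer network with a shared output head over a 5-token vocabulary.
layers = [rng.standard_normal((16, 16)) / 4 for _ in range(12)]
lm_head = rng.standard_normal((16, 5))

def forward_early_exit(x: np.ndarray, threshold: float = 0.9):
    """Stop as soon as the shared head is confident, skipping the remaining layers."""
    h = x
    for i, w in enumerate(layers, start=1):
        h = np.tanh(h @ w)
        probs = softmax(h @ lm_head)
        if probs.max() >= threshold:
            return int(probs.argmax()), i  # predicted token, layers actually used
    return int(probs.argmax()), len(layers)  # fallback: full depth

token, used = forward_early_exit(rng.standard_normal(16))
assert 1 <= used <= len(layers)
```

When easy inputs exit after a few layers, average latency drops while hard inputs still get the full network.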