
Boffins improve AI for robots
PaLM-E improves robotic vision and language
AI researchers from Google and the Technical University of Berlin have unveiled PaLM-E, a 562-billion-parameter multimodal embodied visual-language model (VLM) that integrates vision and language for robotic control.