Imagine being in your shop and telling a robot to get you a hammer. It can understand the instruction, perceive the environment using an inverse image model, combine that perception with the instruction, delegate route-finding to a path planner to reach the toolbox, and retrieve the hammer without ever having been explicitly programmed for the task. Now replace yourself with other machine agents in the workshop environment, each being given goals as part of some larger plan. Today's factory robots are highly specialized machines that require a very narrow and controlled problem space, and reconfiguring them is expensive and time consuming, if it's possible at all.
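A minimal sketch of the pipeline that comment describes (language understanding -> perception -> grounding -> path planning -> retrieval), assuming hypothetical placeholder functions throughout; none of these names are a real robotics API:

```python
# Hypothetical sketch of the "get me a hammer" pipeline described above.
# parse_instruction, perceive, and plan_path are all placeholder stubs,
# not calls into any real library.

from dataclasses import dataclass

@dataclass
class Goal:
    target_object: str   # e.g. "hammer"
    destination: str     # e.g. "user"

def parse_instruction(text: str) -> Goal:
    """A language model would turn the spoken command into a structured goal."""
    # Stubbed: a real system would call an LLM here.
    return Goal(target_object="hammer", destination="user")

def perceive(scene_image) -> dict:
    """A vision model would map camera input to labeled objects with positions."""
    # Stubbed output: object name -> (x, y) position in the workshop.
    return {"robot": (1.0, 0.5), "toolbox": (4.0, 2.5), "user": (0.0, 0.0)}

def plan_path(start, goal_pos):
    """Route-finding is delegated to a conventional planner (e.g. A*)."""
    # Stubbed: return a trivial two-point route.
    return [start, goal_pos]

def fetch(command: str, scene_image) -> None:
    goal = parse_instruction(command)          # understand the instruction
    world = perceive(scene_image)              # look at the environment
    route = plan_path(world["robot"], world["toolbox"])  # route to the toolbox
    # ...drive along `route`, grasp the target, return it to the user...
    print(f"Fetching {goal.target_object} via {route}")

fetch("get me a hammer", scene_image=None)
```

The point of the sketch is the composition: no stage is programmed for "hammer" specifically, so swapping the goal or the environment needs no reprogramming, only new inputs to the same pipeline.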
Thank you very much @fnordpiglet for addressing my questions directly and thoroughly! Makes a lot of sense. I still don't understand why a16z wants us to do that, though. What's in it for them?