With 16 GB of RAM, you can run most models up to 3–4B parameters at good quality. This tier covers reasoning models like Phi-4 Mini and a wide range of chat, code, and vision models.
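The sizing behind this tier comes down to simple arithmetic: weights need roughly the parameter count times the bytes per parameter, plus headroom for the KV cache, activations, and the runtime itself. A minimal sketch of that estimate (the function name, the 25% overhead factor, and the byte counts are illustrative assumptions, not figures from this guide):

```python
def approx_ram_gb(params_billions: float,
                  bytes_per_param: float,
                  overhead: float = 1.25) -> float:
    """Rough RAM estimate (GB) to run a model of the given size.

    overhead (assumed ~25%) covers KV cache, activations, and runtime.
    """
    return params_billions * bytes_per_param * overhead

# A 4B model at fp16 (2 bytes/param) vs. 4-bit quantization (~0.5 bytes/param):
fp16_gb = approx_ram_gb(4, 2.0)   # -> 10.0 GB: fits in 16 GB with room for the OS
q4_gb = approx_ram_gb(4, 0.5)     # -> 2.5 GB: fits easily
```

By this estimate, a 3–4B model at full fp16 precision lands comfortably inside 16 GB, which is why this tier tops out around that size without quantization; heavier quantization is what lets larger models squeeze into the same budget.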