Users want to seamlessly integrate Ollama for self-hosted model prototyping and local development, requesting a dedicated option in the SAM initialization GUI to avoid manual .env configuration.
### Feature Description

The ability to use Ollama without manually editing the .env to avoid the `openai/MODELNAME` model-name format would be nice. Specifically, a dedicated option in the SAM initialization GUI.

### Use Case

I'm using self-hosted Ollama for prototyping and local development, without relying on externally hosted models. Having seamless integration into SAM would be useful.

### Proposed Solution

The fix is relatively simple: just an additional option in the init GUI dedicated to Ollama -- I'm not sure whether an API key is still necessary in this case.
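For reference, a sketch of the manual .env workaround this feature would replace. Ollama exposes an OpenAI-compatible API at `/v1`, so it can be used through an OpenAI-style provider entry; the variable names below (`LLM_SERVICE_ENDPOINT`, `LLM_SERVICE_API_KEY`, `LLM_SERVICE_PLANNING_MODEL_NAME`) are illustrative assumptions, not confirmed SAM configuration keys:

```shell
# Hypothetical .env snippet pointing SAM at a local Ollama instance.
# Ollama serves an OpenAI-compatible endpoint at /v1 by default.
LLM_SERVICE_ENDPOINT="http://localhost:11434/v1"

# Ollama ignores the API key, but OpenAI-style clients often require
# a non-empty value, so a placeholder is typically supplied.
LLM_SERVICE_API_KEY="ollama"

# Model must still be prefixed in the openai/MODELNAME style,
# which is exactly the friction this feature request targets.
LLM_SERVICE_PLANNING_MODEL_NAME="openai/llama3.1"
```

A dedicated Ollama option in the init GUI could pre-fill the endpoint and placeholder key and drop the `openai/` prefix requirement.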