Hi everyone,
I’m trying to run deepseek-coder:6.7b-instruct-q4_K_M with Ollama in Docker as the LLM for my CrewAI agents. However, I’m hitting errors while initializing the SQL Query Generator Agent.
Here’s the error log:
2025-03-12 06:17:59,984 - INFO - Creating SQL Query Generator Agent
2025-03-12 06:17:59,991 - ERROR - Failed to get supported params: argument of type 'NoneType' is not iterable
ERROR - LiteLLM call failed: litellm.BadRequestError: LLM Provider NOT provided.
Pass in the LLM provider you are trying to call. You passed model=base_url='http://ollama:11434' model='deepseek-coder:6.7b-instruct-q4_K_M' temperature=0.0
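For context, the model string I'm passing is exactly the one shown in the error. From reading the LiteLLM docs, I suspect it wants a provider prefix in the form provider/model (e.g. "ollama/..."), but I'm not sure. A plain-Python sketch of what I mean, just to show the string format I think is expected:

```python
base_url = "http://ollama:11434"
model_name = "deepseek-coder:6.7b-instruct-q4_K_M"

# What I'm currently passing -- this is what produces the
# "LLM Provider NOT provided" BadRequestError above:
current = model_name

# What I suspect LiteLLM expects: an explicit "<provider>/" prefix
# so it knows to route the call to the Ollama backend.
suspected = f"ollama/{model_name}"
print(suspected)  # ollama/deepseek-coder:6.7b-instruct-q4_K_M
```

Is prefixing the model name like this the right fix, or does CrewAI expect the provider to be configured somewhere else?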
- How should I properly specify the model provider?
- Is there any special configuration required for CrewAI to recognize this model?
- Has anyone successfully run this model in Docker with CrewAI?