Support CNB_WELCOME_CMD in a dual-container environment #3604
```yaml
$:
  vscode:
    - docker:
        image: mirror.ccs.tencentyun.com/ollama/ollama:latest
      runner:
        tags: cnb:arch:amd64:gpu
      services:
        - vscode
        - docker
      env:
        OLLAMA_MODELS: /workspace/models
        OLLAMA_HOST: 0.0.0.0
        OLLAMA_ORIGINS: '*'
        CNB_WELCOME_CMD: bash
      stages:
        - name: start ollama
          script: nohup ollama serve >/dev/null 2>&1 &
```
妲己
What problem will this feature solve?
Envisioned solution, if any?