Commit 9c0644d

Update instructions for lessons learned
1 parent 4510cbe commit 9c0644d

File tree: 1 file changed (+8, -4 lines)

content/modules/ROOT/pages/02-vllm.adoc

@@ -137,6 +137,13 @@ oc get routes.serving.knative.dev -n composer-ai-apps
 
 . Use the `Try it out` option of the `GET /v1/models` endpoint to list the models being deployed by this server. Note that the id for our model matches the name of the model server we created in the OpenShift AI Dashboard.
 
+[WARNING]
+====
+Running into a 404 error on the OCP web console after trying this? Let us know to help identify a bug.
+
+Possible fixes include switching wifi networks, switching to Incognito mode, or deleting cookies and cache.
+====
+
 === Testing the model from Composer AI UI
 
 Now that we have done some basic testing we are ready to try the model from inside of the Composer AI Studio UI.
@@ -148,10 +155,7 @@ Our Composer instance is already setup to point to the vLLM endpoint we created
 +
 image::02-chatbot-route.png[Chatbot Route]
 
-. In the top left hand corner select the `Default Assistant`
-
-+
-image::02-default-assistant.png[Default Assistant]
+. Click on the Assistants on the left hand side, and choose the option to `Create Assistant`. Enter the name "Default Assistant", select the default LLM, and click `Create` without editing any of the fields.
 
 . Ask a question in the UI to verify that the LLM is able to respond.
 
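For reference, the `GET /v1/models` check described in the first hunk can also be run from a terminal instead of the Swagger `Try it out` button. A minimal sketch, assuming the model is exposed through the Knative route returned by `oc get routes.serving.knative.dev -n composer-ai-apps`; the route name `vllm` below is a placeholder and should be replaced with the name shown by that command:

[source,bash]
----
# Look up the public URL of the Knative route for the vLLM service
# (the route name "vllm" is a placeholder for this sketch).
ROUTE_URL=$(oc get routes.serving.knative.dev vllm -n composer-ai-apps \
  -o jsonpath='{.status.url}')

# List the models served by the vLLM OpenAI-compatible API.
# The returned "id" should match the model server name created in the
# OpenShift AI Dashboard, as noted in the updated instructions.
curl -s "${ROUTE_URL}/v1/models"
----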
