Modern software applications increasingly include AI capabilities — answering questions, summarizing content, automating workflows, and more. But building AI apps that are reliable and useful requires understanding the underlying concepts.
In this Labspace, you'll learn the four core pillars of AI application development through hands-on exercises in a live environment.
By the end of this Labspace, you will be able to:
- Understand the Chat Completions API and how to structure messages for a model
- Use prompt engineering techniques including system prompts, few-shot examples, and structured output
- Implement tool calling and the agentic loop in code
- Build a RAG pipeline that grounds model responses in your own data
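As a preview of the first pillar, a Chat Completions request is built from a list of role-tagged messages. The sketch below shows that structure only; the commented-out client call and model name are illustrative assumptions, not part of this Labspace's code:

```python
# A minimal sketch of the message structure the Chat Completions API expects.
# The roles "system" and "user" are standard; the client call and model name
# in the comment below are illustrative assumptions.
messages = [
    {"role": "system", "content": "You are a concise technical assistant."},
    {"role": "user", "content": "Summarize what a RAG pipeline does."},
]

# With an OpenAI-compatible client, this list would be sent roughly as:
# response = client.chat.completions.create(model="<model-name>", messages=messages)
```

The system message sets the model's behavior for the whole conversation, while user messages carry each request — a pattern the first exercise explores in depth.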
To launch the Labspace, run the following command:
```bash
docker compose -f oci://dockersamples/labspace-ai-fundamentals up -d
```

Then open your browser to http://localhost:3030.
If you have the Labspace extension installed (if not, run `docker extension install dockersamples/labspace-extension`), you can also click this link to launch the Labspace.
If you find something wrong or something that needs to be updated, feel free to submit a PR. If you want to make a larger change, feel free to fork the repo.
Important note: If you fork it, you will need to update the GHA workflow to point to your own Hub repo.
- Clone this repo

- Start the Labspace in content development mode:

  ```bash
  # On Mac/Linux
  CONTENT_PATH=$PWD docker compose up --watch
  ```

  ```powershell
  # On Windows with PowerShell
  $Env:CONTENT_PATH = (Get-Location).Path; docker compose up --watch
  ```

- Open the Labspace at http://localhost:3030.

- Make the necessary changes and validate they appear as you expect in the Labspace.
Be sure to check out the docs for additional information and guidelines.