feat: add o3 to model info; update chess example (#5311)
Because.
ekzhu authored Jan 31, 2025
1 parent 69d3af7 commit cd9dca4
Showing 5 changed files with 21 additions and 3 deletions.
@@ -20,12 +20,13 @@ class ModelFamily:

     GPT_4O = "gpt-4o"
     O1 = "o1"
+    O3 = "o3"
     GPT_4 = "gpt-4"
     GPT_35 = "gpt-35"
     R1 = "r1"
     UNKNOWN = "unknown"

-    ANY: TypeAlias = Literal["gpt-4o", "o1", "gpt-4", "gpt-35", "r1", "unknown"]
+    ANY: TypeAlias = Literal["gpt-4o", "o1", "o3", "gpt-4", "gpt-35", "r1", "unknown"]

     def __new__(cls, *args: Any, **kwargs: Any) -> ModelFamily:
         raise TypeError(f"{cls.__name__} is a namespace class and cannot be instantiated.")
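For context, here is a minimal sketch (not part of this commit) of how the new `O3` constant can be consumed: `ModelFamily` members are plain string constants, so callers can group or compare them directly. The `is_reasoning_family` helper below is hypothetical.

```python
# Hypothetical helper, not part of the commit: group the reasoning-model
# families (o1, o3, r1) that ModelFamily defines at this point.
from autogen_core.models import ModelFamily


def is_reasoning_family(family: str) -> bool:
    return family in (ModelFamily.O1, ModelFamily.O3, ModelFamily.R1)


print(is_reasoning_family(ModelFamily.O3))     # True
print(is_reasoning_family(ModelFamily.GPT_4O))  # False
```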
@@ -5,6 +5,7 @@
 # Based on: https://platform.openai.com/docs/models/continuous-model-upgrades
 # This is a moving target, so correctness is checked by the model value returned by openai against expected values at runtime
 _MODEL_POINTERS = {
+    "o3-mini": "o3-mini-2025-01-31",
     "o1": "o1-2024-12-17",
     "o1-preview": "o1-preview-2024-09-12",
     "o1-mini": "o1-mini-2024-09-12",
@@ -19,6 +20,12 @@
 }

 _MODEL_INFO: Dict[str, ModelInfo] = {
+    "o3-mini-2025-01-31": {
+        "vision": False,
+        "function_calling": True,
+        "json_output": True,
+        "family": ModelFamily.O3,
+    },
     "o1-2024-12-17": {
         "vision": False,
         "function_calling": False,
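To illustrate what the two tables above provide (a sketch only; the module's actual helper names may differ): the `"o3-mini"` alias is resolved to the dated snapshot, whose `ModelInfo` entry declares function calling and JSON output but no vision support.

```python
# Sketch of the alias-then-info lookup backed by the tables above; the real
# module's private helpers may be named differently.
from typing import Dict

from autogen_core.models import ModelFamily, ModelInfo

MODEL_POINTERS: Dict[str, str] = {"o3-mini": "o3-mini-2025-01-31"}
MODEL_INFO: Dict[str, ModelInfo] = {
    "o3-mini-2025-01-31": {
        "vision": False,
        "function_calling": True,
        "json_output": True,
        "family": ModelFamily.O3,
    },
}


def get_info(model: str) -> ModelInfo:
    # Follow the pointer when an alias such as "o3-mini" is passed.
    resolved = MODEL_POINTERS.get(model, model)
    return MODEL_INFO[resolved]


print(get_info("o3-mini")["family"])  # "o3"
```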
1 change: 1 addition & 0 deletions python/samples/agentchat_chess_game/.gitignore
@@ -1 +1,2 @@
 model_config.yml
+model_config.yaml
11 changes: 10 additions & 1 deletion python/samples/agentchat_chess_game/README.md
@@ -27,7 +27,16 @@ For example, to use `gpt-4o` model from OpenAI, you can use the following config
 provider: autogen_ext.models.openai.OpenAIChatCompletionClient
 config:
   model: gpt-4o
-  api_key: REPLACE_WITH_YOUR_API_KEY
+  api_key: replace with your API key, or omit it if the OPENAI_API_KEY environment variable is set
 ```
+
+To use the `o3-mini-2025-01-31` model from OpenAI, you can use the following configuration:
+
+```yaml
+provider: autogen_ext.models.openai.OpenAIChatCompletionClient
+config:
+  model: o3-mini-2025-01-31
+  api_key: replace with your API key, or omit it if the OPENAI_API_KEY environment variable is set
+```

 To use a locally hosted DeepSeek-R1:8b model using Ollama through its compatibility endpoint,
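For reference, a sketch (not in the README) of the equivalent client constructed directly in Python rather than via `model_config.yaml`; it assumes the `OPENAI_API_KEY` environment variable is set, so no key is passed explicitly.

```python
# Sketch: construct the same client in Python instead of YAML.
# Assumes OPENAI_API_KEY is set in the environment.
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(model="o3-mini-2025-01-31")
```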
2 changes: 1 addition & 1 deletion python/samples/agentchat_chess_game/main.py
@@ -12,7 +12,7 @@

 def create_ai_player() -> AssistantAgent:
     # Load the model client from config.
-    with open("model_config.yml", "r") as f:
+    with open("model_config.yaml", "r") as f:
         model_config = yaml.safe_load(f)
     model_client = ChatCompletionClient.load_component(model_config)
     # Create an agent that can use the model client.
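As a usage note, a sketch (not part of the commit) of loading the renamed `model_config.yaml` defensively, so a missing config file yields a clear message; the error text is illustrative only.

```python
# Sketch: load model_config.yaml with an explicit existence check before
# handing the parsed config to ChatCompletionClient.load_component.
from pathlib import Path

import yaml

from autogen_core.models import ChatCompletionClient

config_path = Path("model_config.yaml")
if not config_path.exists():
    raise FileNotFoundError(
        "model_config.yaml not found; copy an example config from README.md first."
    )
model_config = yaml.safe_load(config_path.read_text())
model_client = ChatCompletionClient.load_component(model_config)
```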
