Custom model issue #4988
Replies: 1 comment
-
Just help me with how I can access the model and complete the config list; the rest of the code does not need to be reviewed. Just tell me what steps I have to follow to use my custom model.
-
I have seen and read https://microsoft.github.io/autogen/0.2/blog/2024/01/26/Custom-Models. However, my model is my company's internal model. It is a GPT model, just an internal one. To access it, they have provided some configuration details:
```
OPENAI_API_URL
AUTH_URL
USERNAME
PASSWORD
MAX_TOKENS
TEMPERATURE
TOP_P
N
CONTENT-TYPE
X-REQUEST-TYPE
OCP-APIM-SUBSCRIPTION-KEY
ACCEPT-ENCODING
CONNECTION
ACCEPT
DOCX_TEMPLATE_FILE=
JSON_TEMPLATE_FILE
OUTPUT_JSON_FILE
```
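For reference, names like these usually come from a `.env` file, roughly along these lines (every value below is a placeholder I made up, not from the original):

```
OPENAI_API_URL=https://internal-gateway.example.com/openai/chat/completions
AUTH_URL=https://internal-gateway.example.com/auth/token
USERNAME=svc-account
PASSWORD=********
MAX_TOKENS=5000
TEMPERATURE=0.7
X-REQUEST-TYPE=sync
OCP-APIM-SUBSCRIPTION-KEY=********
```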
They have provided me all of these details, and I also have the code that calls the LLM:
```python
from requests.auth import HTTPBasicAuth
from dotenv import dotenv_values
from requests_cache import CachedSession
import json
import requests

# Load environment variables
env_vars = dotenv_values('.env')

# Configure a cached session for the bearer token
session = CachedSession(
    cache_name='cache/BearerToken',
    expire_after=3600  # cache expiry time in seconds
)

prompt = "why is the sky blue?"

# Fetch the auth token from the internal auth endpoint using basic auth
response = session.get(
    env_vars.get('AUTH_URL'),
    verify=False,
    auth=HTTPBasicAuth(env_vars.get('USERNAME'), env_vars.get('PASSWORD')),
)
auth_token = response.headers.get("Authorization")

openai_data = {
    "max_tokens": 5000,
    "temperature": 0.7,
    "messages": [
        {"role": "system", "content": "You are an AI assistant"},
        {"role": "user", "content": prompt},
    ],
}
json_openai_data = json.dumps(openai_data)

url = 'url'  # the internal OPENAI_API_URL
headers = {
    'Content-Type': 'application/json',
    'Authorization': auth_token,
    'x-request-type': 'sync',
    'Ocp-Apim-Subscription-Key': 'key',
    'Cache-Control': 'no-cache',
    'Accept-Encoding': 'gzip, deflate, br',
    'Connection': 'keep-alive',
    'Accept': '*/*',
}

response = requests.post(url, data=json_openai_data, headers=headers)
if response.status_code == 200:
    response_data = response.json()
    # print(response_data)
    response = response_data['choices'][0]['message']['content']
    print("*****************************")
    print(response)
print(auth_token)
```
I am able to call the LLM using this code. However, when creating the config list, I cannot work out what I should write so that my AutoGen code can use this LLM. Can someone help me with how to proceed, so that the agents below can call the above LLM as a custom model?
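In AutoGen 0.2 the supported route for a model like this is the custom model client protocol described in the blog post linked above: add a `model_client_cls` entry to the config list, implement a small client class with `create`, `message_retrieval`, `cost`, and a static `get_usage`, then register the class on each agent. Below is a minimal sketch that wraps the token-fetch-and-POST flow from the snippet above; the config field names (`api_url`, `auth_url`, `subscription_key`, etc.) are my own choice, not an AutoGen requirement.

```python
import json
from types import SimpleNamespace

import requests
from requests.auth import HTTPBasicAuth


class CustomModelClient:
    """Sketch of AutoGen 0.2's ModelClient protocol:
    create(), message_retrieval(), cost(), and static get_usage()."""

    def __init__(self, config, **kwargs):
        # Every extra key in the config-list entry arrives here in `config`.
        self.model = config.get("model", "internal-gpt")
        self.api_url = config["api_url"]
        self.auth_url = config["auth_url"]
        self.username = config["username"]
        self.password = config["password"]
        self.subscription_key = config["subscription_key"]
        self.max_tokens = config.get("max_tokens", 5000)
        self.temperature = config.get("temperature", 0.7)

    def _bearer_token(self):
        # Same basic-auth token fetch as in the standalone snippet above.
        resp = requests.get(
            self.auth_url,
            verify=False,
            auth=HTTPBasicAuth(self.username, self.password),
        )
        return resp.headers.get("Authorization")

    def create(self, params):
        headers = {
            "Content-Type": "application/json",
            "Authorization": self._bearer_token(),
            "x-request-type": "sync",
            "Ocp-Apim-Subscription-Key": self.subscription_key,
        }
        payload = {
            "max_tokens": self.max_tokens,
            "temperature": self.temperature,
            "messages": params["messages"],  # AutoGen passes the chat history here
        }
        resp = requests.post(self.api_url, data=json.dumps(payload), headers=headers)
        resp.raise_for_status()
        data = resp.json()
        # Wrap the raw JSON in an OpenAI-style object the agent can read.
        response = SimpleNamespace()
        response.model = self.model
        response.choices = [
            SimpleNamespace(
                message=SimpleNamespace(
                    content=c["message"]["content"], function_call=None
                )
            )
            for c in data["choices"]
        ]
        return response

    def message_retrieval(self, response):
        return [choice.message.content for choice in response.choices]

    def cost(self, response):
        return 0.0  # internal model: no per-token billing to report

    @staticmethod
    def get_usage(response):
        return {}  # fill in prompt/completion token counts if the API returns them


# Config-list entry: "model_client_cls" must be the class name as a string;
# the remaining keys are whatever your __init__ expects (placeholders below).
config_list = [
    {
        "model": "internal-gpt",
        "model_client_cls": "CustomModelClient",
        "api_url": "<OPENAI_API_URL>",
        "auth_url": "<AUTH_URL>",
        "username": "<USERNAME>",
        "password": "<PASSWORD>",
        "subscription_key": "<OCP-APIM-SUBSCRIPTION-KEY>",
    }
]

# After constructing each agent that uses this config (and the GroupChatManager),
# register the class, e.g.:
#   demand_forecaster_agent.register_model_client(model_client_cls=CustomModelClient)
```

AutoGen raises an error at the first LLM call if an agent has a `model_client_cls` entry but `register_model_client` was never called on it, so every agent (and the manager) that uses this config needs the registration line.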
```python
# %%
from autogen import ConversableAgent, UserProxyAgent, AssistantAgent
import autogen

# %%
import pandas as pd
import sqlite3
import json
import requests
from requests.auth import HTTPBasicAuth
from requests_cache import CachedSession
from dotenv import dotenv_values  # loads environment variables from .env

# %%
config_list = [
    {
        # what should go in here for the internal model?
    }
]
```
```python
task = '''
Task: As an architect, you are required to design a solution for the
following business requirements:
- Demand-forecast the value of the provided column.
- Do the calculations to provide statistical values such as:
  1. Optimal EOQ (units).
  2. ROP (units).
  3. Safety stock (units).
- Using these values and the data provided, give recommendations and suggestions for inventory optimization.
- Cost optimization.
'''
```
```python
# %%
demand_forecaster = """
You are an AI assistant with expertise in analyzing data and demand forecasting.
Load the provided data.
You are an expert in analyzing and visualizing data provided to you in CSV format. After analyzing and visualizing the data, you will forecast the value the user asks for. You do not create your own data;
the data will always be provided by the user.
Note: You will use methods like ARIMA/SARIMA only to calculate or forecast the value of that particular column. The time period to forecast will be provided by the user; if not, forecast values for the next 3 days only.
Example: The date column contains dates in the format "07/01/2022", so if the last row of the CSV has the date "09/02/23", you have to forecast the value for 10/02/23, then 11/02/23,
and then the following dates respectively; do not provide values for the present dates.
In the output, provide the forecasted values with their dates and nothing else. The dates are not the present ones,
but the ones after the date where the rows end in the CSV.
Just provide the values and not the code.
The date column is always going to be present in the CSV (Comma-Separated Values) file. Also state whether you used ARIMA or SARIMA for the forecasting.
Note: The CSV file has a column named "Category" with a value named Toys; you have to forecast Units Sold for this category only.
Note: Ask the user for the "Store id" and the "Region" (these are columns in the CSV file) for which they want the forecasted value, if not provided.
Note: Only provide the forecasted values with the dates and the model used, along with the values of p, d, q.
Save the forecasted values and the dates in a .txt file.
When the task is fully completed and verified, conclude your response with TERMINATE.
"""
demand_forecaster += task
```
```python
# %%
demand_forecaster_agent = ConversableAgent(
    "Demand Forecaster",
    llm_config={"config_list": config_list},
    system_message=demand_forecaster,
    # the original code_execution_config was cut off here; code execution is
    # handled by the user_proxy below, so disabling it on this agent is safe
    code_execution_config=False,
)
```
```python
# %%
inventory_optimization_prompt = """
You are an expert in supply chain and inventory optimization.
Using the forecasted daily demand values for the next 3 days obtained from the agent,
perform inventory optimization by calculating:
Inputs:
Output:
"""
```
```python
# %%
inventory_optimizer = ConversableAgent(
    "Inventory Optimizer",
    llm_config={"config_list": config_list},
    system_message=inventory_optimization_prompt,
    human_input_mode="NEVER",
)
```
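Since the inventory optimizer is asked for EOQ, ROP, and safety stock, the standard textbook formulas it should be applying can be sketched as follows (all input numbers are assumed for illustration, not taken from the original data):

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Economic Order Quantity: sqrt(2 * D * S / H)."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

def safety_stock(z, demand_std, lead_time_days):
    """Safety stock for a service-level z-score: z * sigma_d * sqrt(L)."""
    return z * demand_std * math.sqrt(lead_time_days)

def reorder_point(daily_demand, lead_time_days, ss):
    """ROP = expected demand during lead time + safety stock."""
    return daily_demand * lead_time_days + ss

# Assumed example inputs: 100 units/day (~36,500/yr), $50 per order,
# $2/unit/yr holding cost, 4-day lead time, ~95% service level (z = 1.65)
ss = safety_stock(z=1.65, demand_std=12.0, lead_time_days=4)
print(round(eoq(36_500, 50.0, 2.0)))      # optimal order quantity, units -> 1351
print(round(reorder_point(100, 4, ss)))   # reorder point, units -> 440
```

Having these as plain functions also gives a way to sanity-check whatever numbers the LLM agent reports.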
```python
# %%
import os

# Create the working directory for the forecaster
os.makedirs('/content/forecaster', exist_ok=True)
```
```python
# %%
user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    is_termination_msg=lambda msg: msg.get("content") is not None and "TERMINATE" in msg["content"],
    default_auto_reply="",
    max_consecutive_auto_reply=5,
    llm_config=False,
    code_execution_config={
        "last_n_messages": 3,
        "work_dir": "forecaster",
        "use_docker": False,
    },
    human_input_mode="NEVER",
)

file_path = '/content/forecaster/retail_store_inventory.csv'

# Start the conversation
user_proxy.initiate_chat(
    demand_forecaster_agent,
    message=f"Load the data from {file_path} and conduct the analysis and forecasting "
            f"of the Units Sold column where the Category is Toys, the Region is South, "
            f"and the Store id is S001. Print yes if you are able to access, read, and "
            f"open the CSV, and no if you are not.",
)
```
```python
# %%
def state_transition(last_speaker, groupchat):
    messages = groupchat.messages
    if last_speaker is user_proxy:
        return demand_forecaster_agent
    elif last_speaker is demand_forecaster_agent:
        return inventory_optimizer
    elif last_speaker is inventory_optimizer:
        return None  # end the round; the original branch body was missing

groupchat = autogen.GroupChat(
    agents=[user_proxy, demand_forecaster_agent, inventory_optimizer],
    messages=[],
    max_round=6,
    speaker_selection_method=state_transition,
)
manager = autogen.GroupChatManager(
    groupchat=groupchat,
    llm_config={"config_list": config_list},
    is_termination_msg=lambda x: x.get("content", "") and x.get("content", "").rstrip().endswith("TERMINATE"),
    code_execution_config=False,
)

file_path = r'C:\Users\hsharma172\OneDrive - PwC\Desktop\sc&l\forecaster\retail_store_inventory.csv'

# Start the conversation
user_proxy.initiate_chat(
    manager,
    message=f"Load the data from {file_path} where Store_id = S002 and the Region is West, "
            f"and also provide me recommendations or suggestions for what I should do "
            f"for inventory optimization.",
)
```
Can someone help me with how I can use it?