17 changes: 15 additions & 2 deletions lightllm/server/api_http.py
@@ -117,9 +117,22 @@ def set_args(self, args: StartArgs):
g_objs.app = app


-def create_error_response(status_code: HTTPStatus, message: str) -> JSONResponse:
+def create_error_response(
+    status_code: HTTPStatus, message: str, err_type: str = None, param: str = None
+) -> JSONResponse:
+    if err_type is None:
+        if status_code.value >= 500:
+            err_type = "InternalServerError"
+        elif status_code == HTTPStatus.NOT_FOUND:
+            err_type = "NotFoundError"
+        else:
+            err_type = "BadRequestError"
+
     g_objs.metric_client.counter_inc("lightllm_request_failure")
-    return JSONResponse({"message": message}, status_code=status_code.value)
+    return JSONResponse(
+        {"error": {"message": message, "type": err_type, "param": param, "code": status_code.value}},
Contributor comment (severity: medium):

The OpenAI API specification typically expects the `code` field to be a string or null. Using an integer might cause issues with some client libraries that strictly validate types. Converting the status code to a string is a safer approach and matches the behavior of other OpenAI-compatible servers such as vLLM.

Suggested change:
-        {"error": {"message": message, "type": err_type, "param": param, "code": status_code.value}},
+        {"error": {"message": message, "type": err_type, "param": param, "code": str(status_code.value)}},

+        status_code=status_code.value,
+    )


 @app.get("/liveness")
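The reviewer's string-`code` suggestion can be illustrated with a minimal, framework-free sketch. The helper name `build_error_payload` is hypothetical and the FastAPI `JSONResponse` wrapper is omitted for illustration; only the payload logic mirrors the diff above.

```python
from http import HTTPStatus


def build_error_payload(status_code: HTTPStatus, message: str,
                        err_type: str = None, param: str = None) -> dict:
    # Same fallback classification as the patched create_error_response.
    if err_type is None:
        if status_code.value >= 500:
            err_type = "InternalServerError"
        elif status_code == HTTPStatus.NOT_FOUND:
            err_type = "NotFoundError"
        else:
            err_type = "BadRequestError"
    # Per the review comment: emit the code as a string so that strictly
    # typed OpenAI client libraries accept the error object.
    return {"error": {"message": message, "type": err_type,
                      "param": param, "code": str(status_code.value)}}


payload = build_error_payload(HTTPStatus.NOT_FOUND, "model not found")
print(payload["error"]["type"], payload["error"]["code"])  # NotFoundError 404
```

In the real helper this dict would be passed to `JSONResponse(..., status_code=status_code.value)` exactly as in the diff.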
17 changes: 15 additions & 2 deletions lightllm/server/api_openai.py
@@ -58,11 +58,24 @@
logger = init_logger(__name__)


-def create_error_response(status_code: HTTPStatus, message: str) -> JSONResponse:
+def create_error_response(
+    status_code: HTTPStatus, message: str, err_type: str = None, param: str = None
+) -> JSONResponse:
Contributor comment on lines +61 to +63 (severity: medium):

This function is a duplicate of the one implemented in api_http.py. Duplicating logic across files increases maintenance overhead and the risk of divergence. It is recommended to centralize this utility in a shared module. If circular dependencies between api_http.py and api_openai.py are a concern, consider moving the shared logic to a common utility file (e.g. lightllm/utils/api_utils.py) that both can import.

     from .api_http import g_objs

+    if err_type is None:
+        if status_code.value >= 500:
+            err_type = "InternalServerError"
+        elif status_code == HTTPStatus.NOT_FOUND:
+            err_type = "NotFoundError"
+        else:
+            err_type = "BadRequestError"
+
     g_objs.metric_client.counter_inc("lightllm_request_failure")
-    return JSONResponse({"message": message}, status_code=status_code.value)
+    return JSONResponse(
+        {"error": {"message": message, "type": err_type, "param": param, "code": status_code.value}},
Contributor comment (severity: medium):

The `code` field should be a string to align with the OpenAI API standard, ensuring consistency with the implementation in api_http.py.

Suggested change:
-        {"error": {"message": message, "type": err_type, "param": param, "code": status_code.value}},
+        {"error": {"message": message, "type": err_type, "param": param, "code": str(status_code.value)}},

+        status_code=status_code.value,
+    )
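The deduplication comment above could be addressed with a shared helper along these lines. The module path `lightllm/utils/api_utils.py` and the function names `resolve_err_type` and `error_body` are the reviewer's hypothetical suggestion and this sketch's inventions, not the project's actual code; each API module would wrap the returned dict in its own `JSONResponse`.

```python
# Hypothetical lightllm/utils/api_utils.py: one copy of the error logic
# that both api_http.py and api_openai.py could import.
from http import HTTPStatus


def resolve_err_type(status_code: HTTPStatus) -> str:
    """Map an HTTP status to the error-type labels used in this PR."""
    if status_code.value >= 500:
        return "InternalServerError"
    if status_code == HTTPStatus.NOT_FOUND:
        return "NotFoundError"
    return "BadRequestError"


def error_body(status_code: HTTPStatus, message: str,
               err_type: str = None, param: str = None) -> dict:
    # Callers pass this dict to JSONResponse(..., status_code=status_code.value);
    # the code is stringified per the review comments on both files.
    if err_type is None:
        err_type = resolve_err_type(status_code)
    return {"error": {"message": message, "type": err_type,
                      "param": param, "code": str(status_code.value)}}
```

Centralizing the mapping also guarantees the two endpoints cannot drift apart, which is the divergence risk the reviewer flags.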


def _process_tool_call_id(