/u/yurenh2/miniforge3/envs/eval/lib/python3.11/site-packages/transformers/utils/hub.py:110: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.
  warnings.warn(
`torch_dtype` is deprecated! Use `dtype` instead!
Traceback (most recent call last):
  File "/projects/bfqt/users/yurenh2/ml-projects/personalization-user-model/scripts/test_reward_comparison.py", line 382, in <module>
    main()
  File "/projects/bfqt/users/yurenh2/ml-projects/personalization-user-model/scripts/test_reward_comparison.py", line 378, in main
    asyncio.run(run_comparison(args.local_model, args.device))
  File "/u/yurenh2/miniforge3/envs/eval/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/u/yurenh2/miniforge3/envs/eval/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/u/yurenh2/miniforge3/envs/eval/lib/python3.11/asyncio/base_events.py", line 654, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/projects/bfqt/users/yurenh2/ml-projects/personalization-user-model/scripts/test_reward_comparison.py", line 283, in run_comparison
    gpt_result, gpt_raw = await gpt_judge.judge(
                          ^^^^^^^^^^^^^^^^^^^^^^
  File "/projects/bfqt/users/yurenh2/ml-projects/personalization-user-model/scripts/test_reward_comparison.py", line 242, in judge
    response = await self.client.chat.completions.create(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/u/yurenh2/miniforge3/envs/eval/lib/python3.11/site-packages/openai/resources/chat/completions/completions.py", line 2678, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/u/yurenh2/miniforge3/envs/eval/lib/python3.11/site-packages/openai/_base_client.py", line 1797, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/u/yurenh2/miniforge3/envs/eval/lib/python3.11/site-packages/openai/_base_client.py", line 1597, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported value: 'temperature' does not support 0.1 with this model. Only the default (1) value is supported.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_value'}}
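
Why this happens: the judge model being called only accepts the default sampling temperature (1), so the temperature=0.1 passed to chat.completions.create at test_reward_comparison.py line 242 is rejected with a 400 before the request runs. A minimal sketch of one way to guard the call, assuming an AsyncOpenAI client; judge_once and FIXED_TEMPERATURE_PREFIXES below are illustrative names, not the script's actual code, and the prefix list is an assumption to adjust for your judge model:

import asyncio
from openai import AsyncOpenAI

# Assumption: these model families reject any non-default temperature,
# which produces exactly the 'unsupported_value' 400 shown above.
FIXED_TEMPERATURE_PREFIXES = ("o1", "o3", "o4", "gpt-5")

async def judge_once(client: AsyncOpenAI, model: str, messages: list[dict]) -> str:
    # Build kwargs first so temperature is only included when supported.
    kwargs: dict = {"model": model, "messages": messages}
    if not model.startswith(FIXED_TEMPERATURE_PREFIXES):
        kwargs["temperature"] = 0.1
    response = await client.chat.completions.create(**kwargs)
    return response.choices[0].message.content

async def main() -> None:
    client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment
    print(await judge_once(client, "gpt-5", [{"role": "user", "content": "ping"}]))

if __name__ == "__main__":
    asyncio.run(main())

The same pattern drops into the judge() method in the traceback: assemble the create() keyword arguments conditionally instead of hard-coding temperature=0.1. The two warnings at the top of the log are unrelated to the crash; they only ask you to migrate from TRANSFORMERS_CACHE to HF_HOME and from torch_dtype to dtype before the deprecated names are removed.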