GPT-5.2 is live #458
zemaj announced in Announcements
Replies: 1 comment
Do you use any local models to save on rate limits or API costs? I have a Mac with 128GB of RAM. Is there a workflow with Code that I could use? Or do you think it's not worth it?
---
Seeing great results so far. Tool use with this model is A.M.A.Z.I.N.G. This is particularly important for Code, since we push the range of available tools a bit further than most CLIs.
I recommend using GPT-5.2 High for most tasks: it's a great balance of speed and intelligence, and it's even more token-efficient than -max!