Add explicit codable implementation to ChatMessage and new OpenAIEndpointModelType #107
Fixes #100
Adds an explicit `Codable` implementation to `ChatMessage` without removing the `Identifiable` conformance: the custom implementation ignores the `id` property when encoding and creates a new UUID when decoding. This should fix both regular chat and streaming chat, neither of which has been working because the `id` was being encoded.
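For illustration, here is a minimal sketch of such an explicit `Codable` implementation, assuming `ChatMessage` carries `role` and `content` string fields (the real type's properties may differ):

```swift
import Foundation

// Sketch only: `role` and `content` are assumed fields, not necessarily the library's exact ones.
public struct ChatMessage: Codable, Identifiable {
    public let id: UUID
    public let role: String
    public let content: String

    // Only the fields the API expects; `id` is deliberately excluded.
    private enum CodingKeys: String, CodingKey {
        case role, content
    }

    public init(role: String, content: String) {
        self.id = UUID()
        self.role = role
        self.content = content
    }

    public init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        // A fresh UUID is generated on decode; `id` never comes from the payload.
        self.id = UUID()
        self.role = try container.decode(String.self, forKey: .role)
        self.content = try container.decode(String.self, forKey: .content)
    }

    public func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        // `id` is omitted so the API only ever sees role/content.
        try container.encode(role, forKey: .role)
        try container.encode(content, forKey: .content)
    }
}
```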
I've also added a new `OpenAIEndpointModelType` containing the currently available OpenAI models, to replace `OpenAIModelType`. The current `OpenAIModelType` enum and the methods that take models as parameters can easily lead to errors, because any model can be passed to any of those methods. The new `OpenAIEndpointModelType` mirrors the compatibility list in the OpenAI docs, which makes it much easier to select a fitting model for each endpoint. The new type intentionally contains duplicate model strings, to keep it easier to maintain.
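As a rough sketch of the idea, an endpoint-scoped type could look like the following; the endpoint groups and model strings below are examples, not the exact list added in this PR:

```swift
// Illustrative sketch: one nested enum per endpoint, so a call site can only
// pick models that the OpenAI docs list as compatible with that endpoint.
public enum OpenAIEndpointModelType {

    public enum ChatCompletions: String {
        case gpt4 = "gpt-4"
        case gpt3_5Turbo = "gpt-3.5-turbo"
    }

    public enum Completions: String {
        case textDavinci003 = "text-davinci-003"
    }

    public enum Embeddings: String {
        // Duplicate raw strings across endpoints are intentional, so each
        // endpoint's list can be maintained independently.
        case textEmbeddingAda002 = "text-embedding-ada-002"
    }
}
```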
Backward compatibility is preserved by marking the previous methods as deprecated and turning them into wrappers around the new methods.
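A hypothetical sketch of that deprecation pattern, using made-up method names and a `LegacyModel` stand-in for `OpenAIModelType` rather than the library's real signatures:

```swift
// Stand-in for the old, unconstrained model type.
public struct LegacyModel {
    public let modelName: String
}

// New entry point that only accepts chat-compatible models.
public func sendChat(messages: [ChatMessage],
                     model: OpenAIEndpointModelType.ChatCompletions) {
    // ... perform the request with `model.rawValue` ...
}

@available(*, deprecated, message: "Use sendChat(messages:model:) with OpenAIEndpointModelType.ChatCompletions")
public func sendChat(messages: [ChatMessage], legacyModel: LegacyModel) {
    // The old method becomes a thin wrapper that maps the legacy model string
    // onto the endpoint-scoped type and forwards to the new implementation.
    let model = OpenAIEndpointModelType.ChatCompletions(rawValue: legacyModel.modelName) ?? .gpt3_5Turbo
    sendChat(messages: messages, model: model)
}
```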