chore(error-handling): add getErrorMessage util and use it in Ollama #47
base: master
Conversation
… and dataset; add tests
…(safer error handling)
Summary of Changes

Hello @morningstarxcdcode, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed.

This pull request introduces a comprehensive backend infrastructure for a new application, 'LocalMind'. It sets up a Node.js/Express/TypeScript project with MongoDB integration, robust user authentication including API key generation, and flexible AI model management. The system is designed to integrate with AI providers such as Ollama, Google Gemini, and Groq, allowing users to configure and interact with these models. It also includes features for processing datasets and a new utility for standardized error message extraction. The scope of changes is significantly broader than indicated by the PR title, effectively initializing a full-fledged backend service.
Code Review
This pull request introduces a substantial amount of new code, essentially setting up an entire backend application. The title "chore(error-handling): add getErrorMessage util and use it in Ollama" does not reflect the scope of these changes. While the new error handling utility is a good addition, the PR includes multiple new features like user authentication, AI model configuration, and dataset processing. In the future, it would be highly beneficial to break down such large changes into smaller, more focused pull requests. This makes the review process more manageable and allows for more detailed feedback. My review focuses on architectural patterns, correctness, and potential bugs I've identified across the new modules.
Pull request overview
This pull request adds a getErrorMessage utility function to safely extract error messages from unknown error types and updates the Ollama service and utility files to use this helper instead of directly accessing properties on any-typed errors in catch blocks.
Key Changes:
- Introduces `src/utils/error.util.ts` with a `getErrorMessage` helper that safely extracts error messages
- Updates `src/api/v1/Ai-model/Ollama/Ollama.service.ts` to use `getErrorMessage` in catch blocks
- Updates `src/api/v1/Ai-model/Ollama/Ollama.utils.ts` to use `getErrorMessage` in catch blocks
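Based on the description above, such a helper might look like the following sketch. This is a hypothetical implementation; the actual code in `src/utils/error.util.ts` may differ.

```typescript
// Hypothetical sketch of a getErrorMessage-style helper; the PR's
// actual implementation may handle cases differently.
export function getErrorMessage(error: unknown): string {
  if (error instanceof Error) return error.message;
  if (typeof error === "string") return error;
  try {
    // Fall back to serializing non-Error values (e.g. thrown objects).
    return JSON.stringify(error);
  } catch {
    return String(error);
  }
}
```

The key point is accepting `unknown` rather than `any`: the compiler then forces a type check before any property access, which is exactly the unsafe pattern the PR removes from the catch blocks.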
Reviewed changes
Copilot reviewed 58 out of 66 changed files in this pull request and generated 7 comments.
| File | Description |
|---|---|
| `src/utils/error.util.ts` | New utility function to safely extract error messages from unknown error types |
| `src/api/v1/Ai-model/Ollama/Ollama.service.ts` | Updated to use `getErrorMessage` helper in both `getVector` and `generateText` methods |
| `src/api/v1/Ai-model/Ollama/Ollama.utils.ts` | Updated to use `getErrorMessage` helper in `isModelAvailable` and `listAvailableModels` methods |
| `src/utils/test/safeJson.util.test.ts` | Test file for JSON parsing utility (reference for testing patterns) |
| `src/api/v1/Ai-model/Ollama/test/Ollama.test.ts` | Tests for Ollama service functionality |
| [Other files] | Many additional files appear to be part of initial codebase setup rather than error handling changes |
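The catch-block change the table describes for `Ollama.service.ts` can be sketched as follows. The helper is inlined here for self-containment, and the method signature and error wrapping are assumptions for illustration, not the PR's exact code.

```typescript
// Illustrative only: shows the pattern of replacing direct property
// access on an any-typed error with a safe helper in a catch block.
function getErrorMessage(error: unknown): string {
  return error instanceof Error ? error.message : String(error);
}

async function generateText(callApi: () => Promise<string>): Promise<string> {
  try {
    return await callApi();
  } catch (error: unknown) {
    // Before: (error as any).message — undefined when a string is thrown.
    // After: a usable message for any thrown value.
    throw new Error(`Ollama generateText failed: ${getErrorMessage(error)}`);
  }
}
```

The same pattern applies to `getVector` and to the `Ollama.utils.ts` methods listed above.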
@morningstarxcdcode More than 10,000 lines changed. Can you explain your changes?
Fixes #46
Summary: Adds a `getErrorMessage` helper and updates the Ollama utils/service to safely extract error messages, avoiding unsafe property access on `any`-typed errors in catch clauses.
Files changed: `src/utils/error.util.ts`, `src/api/v1/Ai-model/Ollama/Ollama.service.ts`, `src/api/v1/Ai-model/Ollama/Ollama.utils.ts`.