Update System requirements for running local LLM in README #16

@PyDevC

Description

This line does not direct us anywhere; mentioning it here is practically useless:
- system specs for local llm inferencing [Checkout Here for system requriements]()

Resources to find these system requirements

Also mention the recommended system requirements for a smooth coding experience.
We could also recommend models that are suitable for coding locally.
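As a starting point for writing those README numbers, a rough memory estimate can be derived from model size and quantization level. The sketch below is illustrative only; the function name, the 1.2x overhead factor (for KV cache and runtime buffers), and the example figures are assumptions, not measured requirements for any specific model:

```python
def estimate_model_memory_gb(num_params_billion: float,
                             bits_per_weight: int,
                             overhead: float = 1.2) -> float:
    """Rough RAM/VRAM needed to load a model's weights.

    overhead (assumed 1.2x) loosely accounts for KV cache and
    runtime buffers; real usage varies with context length.
    """
    bytes_per_weight = bits_per_weight / 8
    total_bytes = num_params_billion * 1e9 * bytes_per_weight * overhead
    return total_bytes / 2**30  # bytes -> GiB

# e.g. a 7B model quantized to 4 bits per weight
print(round(estimate_model_memory_gb(7, 4), 1))  # → 3.9
```

By this estimate, a 7B model at 4-bit quantization fits comfortably in 8 GB of RAM/VRAM, while the same model at 16-bit would need roughly 16 GB, which is the kind of tiering the README table could present.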

Labels: documentation (Improvements or additions to documentation)
