
feat(AT-39): add lora by user from interface #9

Open
KillianBeuf wants to merge 5 commits into main from
feat/AT-39-add-lora-by-user-from-interface

Conversation

@KillianBeuf (Collaborator) commented Jun 13, 2025

Summary by CodeRabbit

  • New Features

    • Added user-specific handling and support for uploading LoRA model files directly through the interface, with immediate updates to model and LoRA selections.
    • Introduced a new command-line option to specify custom paths for LoRA models.
  • Improvements

    • Enhanced LoRA file management by organizing files into a "common" subdirectory and supporting user-specific directories.
    • Broadened search paths for LoRA files to include both common and performance directories.
    • Updated preset configurations to reflect the new LoRA directory structure.
  • Bug Fixes

    • Removed unnecessary debug output from authentication checks.
  • Documentation

    • Reformatted JSON preset files for improved readability and consistency.

@coderabbitai bot commented Jun 13, 2025

Walkthrough

The changes introduce user-specific LoRA file management and upload functionality. LoRA paths are restructured to include "common" and per-user subdirectories, with configuration and model selection logic updated accordingly. The web UI now allows authenticated users to upload LoRA files, which are stored in user-specific folders, and the UI dynamically refreshes to reflect uploaded files.

Changes

  • webui.py: Added user-aware LoRA upload and dynamic refresh of model/LoRA dropdowns; introduced functions for handling uploads and UI updates; removed old refresh logic.
  • ldm_patched/modules/utils.py: Added move_lora(filepath, username) to move LoRA files to user-specific directories under the first LoRA path.
  • ldm_patched/modules/args_parser.py: Added a --path-loras argument for specifying LoRA paths via the command line.
  • modules/config.py: Introduced get_loras_path() for resolving LoRA paths, supporting user-specific and "common" directories; updated LoRA download and file-update logic to use the new structure.
  • modules/core.py: Modified the LoRA file search in refresh_loras to include both standard and performance LoRA paths.
  • modules/model_structure.py: Changed LoRA directory creation to nest under a "common" subdirectory.
  • simpleai_base/models_info.py: Updated get_model_names to filter LoRA files by user and "common" directories, supporting authentication context.
  • modules/auth.py: Removed debug print statements from authentication logic.
  • launch.py: Updated LoRA download logic to use "common" subdirectories within LoRA paths.
  • presets/*.json: Reformatted for compactness; updated LoRA file paths to include the common\ prefix where applicable.

Sequence Diagram(s)

sequenceDiagram
    participant User
    participant WebUI
    participant move_lora (utils)
    participant Config/ModelInfo

    User->>WebUI: Upload LoRA file (.safetensors/.ckpt)
    WebUI->>move_lora: move_lora(filepath, username)
    move_lora->>Config/ModelInfo: Store file in user-specific LoRA directory
    move_lora-->>WebUI: Success/Failure
    WebUI->>Config/ModelInfo: Refresh LoRA/model dropdowns with user context
    Config/ModelInfo-->>WebUI: Updated file lists (including user and "common" LoRAs)
    WebUI-->>User: Updated UI reflecting new LoRA file

Assessment against linked issues

  • Add possibility to add LoRA by user from the interface (AT-39): Addressed ✅

Assessment against linked issues: Out-of-scope changes

  • Reformatting and compacting of JSON preset files (presets/*.json): These changes are formatting-only and do not directly relate to the user-upload or user-specific LoRA management objective. However, they do not impact functionality.

Poem

A rabbit hopped with files to store,
Now users upload LoRAs—oh, what a score!
Each bunny gets their special place,
With folders neat in every case.
The UI refreshes, files appear—
Personalized LoRAs, let’s all cheer!
🐇✨


@coderabbitai bot left a comment

Actionable comments posted: 10

🧹 Nitpick comments (14)
ldm_patched/modules/args_parser.py (1)

43-43: Add help text for new --path-loras argument
The --path-loras option currently lacks a help description and metavar, which impacts CLI discoverability. Please include a descriptive help string (e.g., help="Custom LoRA models base path") and consider adding a metavar for clarity.
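
A minimal sketch of the suggested change (the --path-loras option exists in this PR; the help text and metavar are illustrative, and parser refers to the existing argument parser in args_parser.py):

parser.add_argument(
    "--path-loras",
    type=str,
    default=None,
    metavar="PATH",
    help="Custom base path for LoRA model files (overrides the configured default).",
)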

presets/4GB_Default.json (1)

8-9: Back-slash in JSON paths couples presets to Windows paths

"common\\sd_xl_offset_example-lora_1.0.safetensors" works on Windows but is a literal
back-slash on POSIX. All runtime code uses os.path.join, so keep JSON portable by using a
forward slash (/) or no separator at all (and let the loader prepend "common").

Example:

[ true, "common/sd_xl_offset_example-lora_1.0.safetensors", 0.5 ]

Apply the same convention to every preset touched in this PR.
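
If presets containing back-slashes must keep loading on POSIX during the transition, a small loader-side shim is a cheap safety net (a sketch; normalize_preset_path is a hypothetical helper, not part of this PR):

import os

def normalize_preset_path(raw: str) -> str:
    """Map Windows-style separators from preset JSON onto the host platform's separator."""
    return os.path.normpath(raw.replace("\\", "/"))

# e.g. on POSIX: normalize_preset_path("common\\Flux\\AntiBlur.safetensors")
# -> "common/Flux/AntiBlur.safetensors"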

presets/Flux1D_GGUF.json (1)

26-29: Platform-neutral separator recommended

Same concern as above: "common\\Flux\\Hyper-FLUX.1-dev-8steps-lora.safetensors" embeds Windows
separators. Prefer / or let the runtime build the path.

presets/RealVisXL.json (1)

6-9: Use POSIX separators or leave path building to code

All three modified LoRA paths contain \\. This will create files with back-slashes in their
names on Linux. Align with the cross-platform advice given for the other presets.

presets/HyperFlux8.json (1)

26-32: Hard-coded back-slashes reduce portability

The default-LoRA paths embed Windows back-slashes; on Linux/macOS these are just literal characters and path resolution may fail. Consider storing paths with forward-slashes or resolving them with os.path.join at runtime instead of baking a platform-specific separator into the JSON.

presets/Juggernaut.json (1)

7-7: Same portability note about back-slashes

See previous comment – favour forward-slashes to keep presets platform-neutral.

presets/Flux1S_GGUF.json (1)

27-28: Back-slash paths hinder cross-platform usage

Same remark as above – prefer forward-slashes or build paths at runtime.

presets/Flux1D_8GGUF.json (1)

27-28: Back-slash paths hinder cross-platform usage

See earlier comment; unify path separators for portability.

presets/HyperFlux.json (1)

26-32: Prefer POSIX-style forward slashes for cross-platform paths

The new LoRA entries use back-slash separators ("common\\Flux\\...").
Most of the Python code later treats model paths as raw strings and then feeds them to Path/os.path, which works on Windows but breaks on POSIX when the JSON value is re-used verbatim (e.g. in HTTP APIs).

-    [false, "common\\Flux\\FluxDFaeTasticDetails.safetensors", 0.8],
-    [false, "common\\Flux\\Flux_BlackColor_SaMay.safetensors", 0.7],
+    [false, "common/Flux/FluxDFaeTasticDetails.safetensors", 0.8],
+    [false, "common/Flux/Flux_BlackColor_SaMay.safetensors", 0.7],

Using / keeps the preset portable for every runtime.

simpleai_base/models_info.py (1)

689-696: Clarify filter condition & adopt membership check

The current expression relies on operator precedence and is hard to read:

if model_type != "common" and model_type != user or model_type == "performance":

Parenthesise and use not in for clarity (also silences the Ruff/Pylint hints):

-                    if model_type != "common" and model_type != user or model_type == "performance":
+                    if model_type == "performance" or model_type not in {"common", user}:
                         continue

Behaviour is unchanged but intent is explicit.

🧰 Tools
🪛 Pylint (3.3.7)

[refactor] 692-692: Consider merging these comparisons with 'in' by using 'model_type not in ('common', user)'. Use a set instead if elements are hashable.

(R1714)
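
A standalone spot-check that the rewrite above is behaviour-preserving (and binds tighter than or, so the original parses as (A and B) or C):

user = "alice"
for model_type in ("common", "alice", "bob", "performance"):
    old = model_type != "common" and model_type != user or model_type == "performance"
    new = model_type == "performance" or model_type not in {"common", user}
    assert old == new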

webui.py (3)

890-904: Refresh logic duplicated – extract a helper

refresh_files_clicked and refresh_after_lora_upload repeat almost identical blocks (model/LoRA list rebuild). Move the shared code into a single utility to avoid future drift and ease maintenance; a sketch follows the lint note below.

🧰 Tools
🪛 Ruff (0.11.9)

899-899: Loop control variable i not used within loop body

Rename unused i to _i

(B007)
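
A minimal sketch of such a helper, assuming the config attributes used elsewhere in this PR (update_files(user=...), model_filenames, lora_filenames); all names are illustrative, not the PR's actual code:

import gradio as gr
import modules.config

def rebuild_file_dropdowns(user=None):
    # Rebuild the model dropdown and every LoRA dropdown in one place,
    # so both refresh paths stay in sync.
    modules.config.update_files(user=user)
    updates = [gr.update(choices=modules.config.model_filenames)]
    for _ in range(modules.config.default_max_lora_number):
        updates.append(gr.update(choices=['None'] + modules.config.lora_filenames))
    return updates

Both refresh_files_clicked and refresh_after_lora_upload would then simply return rebuild_file_dropdowns(user).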


870-873: Loop index unused

In for i in range(...):, the loop variable i is never referenced; rename it to _ or drop it to silence linters and clarify intent.

🧰 Tools
🪛 Ruff (0.11.9)

870-870: Loop control variable i not used within loop body

Rename unused i to _i

(B007)


898-901: Same unused loop variable as above

Apply the same rename here for consistency.

🧰 Tools
🪛 Ruff (0.11.9)

899-899: Loop control variable i not used within loop body

Rename unused i to _i

(B007)

modules/config.py (1)

1026-1031: update_files() signature/return mismatch & silent parameter propagation

Good call exposing a user parameter, but:

  1. The docstring (missing) and callers need updating; otherwise devs won’t know the additional filter exists.
  2. The function still returns only three lists even though it populates wildcard_filenames—makes the global harder to reason about. Either return it or leave a comment why it’s global-only.

Minor, yet worth tightening for maintainability.
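
A docstring along these lines would close point 1 (a sketch; verify the exact return value against the code before adopting it):

def update_files(user=None):
    """Re-scan model folders and rebuild the module-level filename lists.

    When `user` is given, LoRA results are restricted to that user's
    directory plus the shared 'common' directory.

    Returns the rebuilt filename lists; wildcard_filenames is updated
    globally rather than returned (see note above).
    """
    ...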

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 3561505 and ac49166.

📒 Files selected for processing (36)
  • launch.py (1 hunks)
  • ldm_patched/modules/args_parser.py (1 hunks)
  • ldm_patched/modules/utils.py (2 hunks)
  • modules/auth.py (0 hunks)
  • modules/config.py (4 hunks)
  • modules/core.py (1 hunks)
  • modules/model_structure.py (1 hunks)
  • presets/4GB_Default.json (1 hunks)
  • presets/4GB_Pony.json (1 hunks)
  • presets/AnimaPencil.json (1 hunks)
  • presets/Cheyenne.json (1 hunks)
  • presets/Flux1D.json (1 hunks)
  • presets/Flux1D_8GGUF.json (1 hunks)
  • presets/Flux1D_GGUF.json (1 hunks)
  • presets/Flux1S.json (1 hunks)
  • presets/Flux1S_GGUF.json (1 hunks)
  • presets/HyDiT.json (1 hunks)
  • presets/HyperFlux.json (1 hunks)
  • presets/HyperFlux16.json (1 hunks)
  • presets/HyperFlux8.json (1 hunks)
  • presets/Juggernaut.json (1 hunks)
  • presets/Kolors.json (1 hunks)
  • presets/KolorsFast.json (1 hunks)
  • presets/KolorsTile.json (1 hunks)
  • presets/Magick.json (1 hunks)
  • presets/Playground.json (1 hunks)
  • presets/Pony.json (1 hunks)
  • presets/PonyFaeTality.json (1 hunks)
  • presets/RealVisXL.json (1 hunks)
  • presets/RealisticPhoto.json (1 hunks)
  • presets/SD1.5_RealVis.json (1 hunks)
  • presets/SDXL.json (1 hunks)
  • presets/Tile.json (1 hunks)
  • presets/default.json (1 hunks)
  • simpleai_base/models_info.py (2 hunks)
  • webui.py (3 hunks)
💤 Files with no reviewable changes (1)
  • modules/auth.py
🧰 Additional context used
🧬 Code Graph Analysis (1)
launch.py (1)
modules/util.py (1)
  • get_file_from_folder_list (402-411)
🪛 Ruff (0.11.9)
simpleai_base/models_info.py

685-685: Use key in dict instead of key in dict.keys()

Remove .keys()

(SIM118)

webui.py

870-870: Loop control variable i not used within loop body

Rename unused i to _i

(B007)


899-899: Loop control variable i not used within loop body

Rename unused i to _i

(B007)

🪛 Pylint (3.3.7)
simpleai_base/models_info.py

[refactor] 680-680: Too many arguments (6/5)

(R0913)


[refactor] 680-680: Too many positional arguments (6/5)

(R0917)


[refactor] 692-692: Consider merging these comparisons with 'in' by using 'model_type not in ('common', user)'. Use a set instead if elements are hashable.

(R1714)


[refactor] 680-680: Too many branches (14/12)

(R0912)

🔇 Additional comments (20)
presets/Pony.json (1)

2-33: Formatting-only change: The JSON was compacted and re-indented for consistency without any semantic or path modifications. All keys, values, and download URLs remain unchanged.

presets/Tile.json (1)

2-54: Stylistic JSON reformat only: The file was restructured to a more compact, single-line format for objects and arrays. No functional or path changes were introduced.

presets/AnimaPencil.json (1)

2-41: Stylistic JSON reformat only: All LoRA, model, and download entries remain identical; only whitespace and line breaks were adjusted.

presets/PonyFaeTality.json (1)

2-31: Stylistic JSON reformat only: Formatting was condensed without altering any data or default settings.

presets/Cheyenne.json (1)

2-32: Reformat-only changes, no semantic modifications.
These updates only adjust whitespace and indentation; the data remains unchanged.

presets/default.json (1)

7-8: LoRA path prefix updated correctly.
The two entries now include the common\ prefix to match the new directory structure and updated loading logic.

presets/SD1.5_RealVis.json (1)

2-38: Reformat-only changes, no semantic modifications.
These updates only adjust whitespace and indentation; the data remains unchanged.

presets/4GB_Pony.json (1)

2-30: Reformat-only changes, no semantic modifications.
These updates only adjust whitespace and indentation; the data remains unchanged.

presets/HyDiT.json (1)

1-40: Reformat-only changes, no semantic modifications.
These updates only adjust whitespace and indentation; the data remains unchanged.

presets/KolorsTile.json (1)

2-45: Formatting-only changes
The JSON has been reformatted for compactness and consistent 2-space indentation. No keys, values, or semantic content were altered.

presets/Magick.json (1)

7-8: Validate LoRA path prefix & cross-platform handling
The first two entries in default_loras now correctly include the "common\\" prefix. Ensure your loader normalizes these backslashes on non-Windows systems or consider using forward slashes for portability.

presets/Kolors.json (1)

2-28: Formatting-only changes
Whitespace and line breaks were condensed; all keys and values remain unchanged.

presets/Flux1S.json (1)

2-50: Formatting-only changes
Reformatted arrays/objects into more compact inline form without modifying any semantic data.

presets/Playground.json (1)

7-8: Validate LoRA path prefix & cross-platform handling
The first two default_loras paths have been updated to include "common\\". Confirm the application handles these backslashes on POSIX or switch to / separators for consistency.

presets/KolorsFast.json (1)

23-23: Verify the "common" prefix and path separators in LoRA paths.
The LoRA reference was updated to "common\\Hyper-SDXL-8steps-lora.safetensors". Ensure that the JSON parser and file-loading logic handle Windows-style backslashes correctly across all platforms, or consider using forward slashes ("common/Hyper-SDXL-8steps-lora.safetensors") for platform-agnostic paths.

presets/Flux1D.json (1)

1-51: Approve formatting-only changes.
This file only underwent whitespace and indentation adjustments without any semantic modifications.

presets/SDXL.json (1)

6-12: Validate updated LoRA path prefix.
The default LoRA entry now points to "common\\sd_xl_offset_example-lora_1.0.safetensors". Confirm that the loader supports this "common" subdirectory and handles backslashes correctly, or switch to forward slashes for consistency.

presets/RealisticPhoto.json (1)

6-12: Check consistency of LoRA path prefix.
The first LoRA model path was changed to "common\\SDXL_FILM_PHOTOGRAPHY_STYLE_V1.safetensors". Verify that this matches the directory structure created by modules/model_structure.py and is accessible at runtime.

presets/HyperFlux16.json (1)

26-27: Confirm new "common" directory for multiple LoRA entries.
Entries for "common\\Flux\\FluxDFaeTasticDetails.safetensors" and "common\\Flux\\AntiBlur.safetensors" were prefixed accordingly. Ensure that nested subdirectories under "common" are created and the search logic in modules/core.py covers these paths.

simpleai_base/models_info.py (1)

680-683: Good fix – replaced mutable default with None

Switching the default value of filters from [] to None prevents the classic mutable-default foot-gun (illustrated below).
No action needed.

🧰 Tools
🪛 Pylint (3.3.7)

[refactor] 680-680: Too many arguments (6/5)

(R0913)


[refactor] 680-680: Too many positional arguments (6/5)

(R0917)


[refactor] 680-680: Too many branches (14/12)

(R0912)
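
The foot-gun itself, in miniature (standalone illustration, unrelated to this repository's code):

def buggy(filters=[]):      # the SAME list object is reused across all calls
    filters.append("x")
    return filters

print(buggy())  # ['x']
print(buggy())  # ['x', 'x']  <- state leaks between unrelated calls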

Comment on lines +147 to 149
        common_paths = [os.path.join(path, 'common') for path in config.paths_loras]
        model_dir = os.path.dirname(get_file_from_folder_list(file_name, common_paths))
        load_file_from_url(url=url, model_dir=model_dir, file_name=file_name)

💡 Verification agent

❓ Verification inconclusive

Hard-coding the "common" sub-folder scatters path logic and risks duplication

The same "common" suffix is now appended here, in modules/model_structure.create_model_structure(), and (implicitly) in several preset JSON files. When the convention changes again (e.g. per-user folders, a different tier like "public"), every call-site must be audited.

-        common_paths = [os.path.join(path, 'common') for path in config.paths_loras]
+        # keep the “where are my LoRAs?” rule in one place
+        common_paths = config.get_common_lora_paths()

Consider adding a helper such as config.get_common_lora_paths() (or re-using one if it already exists) so the rule lives in exactly one module.


Extract “common” path logic into a single helper

To avoid scattering the "common" suffix in multiple places, centralize it in one method on config.

• In launch.py (lines 147–149) and in modules/model_structure.py you’re appending "common" to each entry in config.paths_loras.
• Preset JSON files also hard‐code the "common" folder.

Proposed changes:

  1. Add to your config class (e.g. in config.py):
    def get_common_lora_paths(self) -> list[str]:
        """Return each LORA root path with the 'common' subfolder appended."""
        return [os.path.join(path, "common") for path in self.paths_loras]
  2. Replace both call-sites:
-   common_paths = [os.path.join(path, 'common') for path in config.paths_loras]
+   common_paths = config.get_common_lora_paths()

This keeps the convention in one place and prevents divergence when folder tiers change.

🤖 Prompt for AI Agents
In launch.py around lines 147 to 149, the "common" subfolder is hard-coded when
constructing paths, which duplicates logic also found in
modules/model_structure.py and preset JSON files. To fix this, add a helper
method get_common_lora_paths() to the config class that returns paths_loras
entries with "common" appended, then replace the list comprehension in launch.py
and the similar code in model_structure.py with calls to this new helper. This
centralizes the "common" path logic and prevents inconsistencies.

Comment on lines +14 to +18
os.makedirs(config.paths_loras[0] + '\common\Alternative', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\Flux', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\Pony', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\SD1.5', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\SD3x', exist_ok=True)

⚠️ Potential issue

Windows-only path separators will break on Linux / macOS

os.makedirs(config.paths_loras[0] + '\common\Alternative', …) embeds back-slashes.
On POSIX these are literal characters, so the code creates a directory literally named
“…/loras\common\Alternative”, and later look-ups using os.path.join(base, "common", "Alternative")
will not find it.

-os.makedirs(config.paths_loras[0] + '\common\Alternative', exist_ok=True)        
-os.makedirs(config.paths_loras[0] + '\common\Flux', exist_ok=True)
-
+base = config.paths_loras[0]
+for sub in ("Alternative", "Flux", "Pony", "SD1.5", "SD3x"):
+    os.makedirs(os.path.join(base, "common", sub), exist_ok=True)

Please replace the string concatenation with os.path.join (or pathlib) for full cross-platform support.

📝 Committable suggestion


Suggested change
os.makedirs(config.paths_loras[0] + '\common\Alternative', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\Flux', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\Pony', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\SD1.5', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\SD3x', exist_ok=True)
base = config.paths_loras[0]
for sub in ("Alternative", "Flux", "Pony", "SD1.5", "SD3x"):
    os.makedirs(os.path.join(base, "common", sub), exist_ok=True)
🤖 Prompt for AI Agents
In modules/model_structure.py around lines 14 to 18, the directory paths are
constructed using string concatenation with backslashes, which causes issues on
Linux and macOS. Replace these concatenations with os.path.join to build the
paths in a cross-platform way, ensuring the directories are created correctly
regardless of the operating system.
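
For completeness, an equivalent pathlib variant of the same fix (a sketch using the config object referenced above):

from pathlib import Path

base = Path(config.paths_loras[0]) / "common"
for sub in ("Alternative", "Flux", "Pony", "SD1.5", "SD3x"):
    (base / sub).mkdir(parents=True, exist_ok=True)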

Comment on presets/HyperFlux8.json, line 2

{
  "preset category": "Flux",

⚠️ Potential issue

Inconsistent "preset category" key breaks preset parsing

Other presets use the key "preset_category" (underscore). Having a space here will force every consumer that relies on the canonical key to special-case this file or simply fail to load it. Rename the key to keep the contract stable.

-  "preset category": "Flux",
+  "preset_category": "Flux",
📝 Committable suggestion


Suggested change
"preset category": "Flux",
"preset_category": "Flux",
🤖 Prompt for AI Agents
In presets/HyperFlux8.json at line 2, the key "preset category" uses a space
instead of an underscore, which breaks preset parsing. Rename the key to
"preset_category" to maintain consistency with other presets and ensure all
consumers can correctly parse this file without special casing.

Comment on lines +10 to +11
["None", 1.0],
["None", 1.0]

⚠️ Potential issue

default_loras rows have inconsistent arity

Rows 4 & 5 only contain two elements, whereas the loader expects
[enabled:boolean, path:str, weight:number]. Omitting the boolean will shift the values and be parsed as (enabled='None', path=1.0, weight=<missing>), silently breaking LoRA loading.

-    ["None", 1.0],
-    ["None", 1.0]
+    [true, "None", 1.0],
+    [true, "None", 1.0]
📝 Committable suggestion


Suggested change
["None", 1.0],
["None", 1.0]
[true, "None", 1.0],
[true, "None", 1.0]
🤖 Prompt for AI Agents
In presets/Juggernaut.json around lines 10 to 11, the entries in the
default_loras array have only two elements instead of the expected three:
enabled (boolean), path (string), and weight (number). To fix this, add the
missing boolean enabled value as the first element in each array entry, ensuring
the structure matches [enabled:boolean, path:string, weight:number] to prevent
incorrect parsing and loading errors.
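
Until every preset is migrated to three-element rows, the loader could also normalise arity defensively (a sketch; normalize_lora_row is hypothetical, not part of this PR):

def normalize_lora_row(row):
    # Accept both legacy [path, weight] and canonical [enabled, path, weight] rows.
    if len(row) == 2:          # legacy form: enabled flag omitted
        path, weight = row
        enabled = True
    else:
        enabled, path, weight = row
    return bool(enabled), str(path), float(weight)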

Comment on lines +84 to 94
                loras_paths = []
                if isinstance(modules.config.paths_loras, list):
                    loras_paths.extend(modules.config.paths_loras)
                else:
                    loras_paths.append(modules.config.paths_loras)
                if isinstance(modules.config.paths_performance_loras, list):
                    loras_paths.extend(modules.config.paths_performance_loras)
                else:
                    loras_paths.append(modules.config.paths_performance_loras)
                lora_filename = get_file_from_folder_list(filename, loras_paths)


⚠️ Potential issue

Guard against None returns and remove duplicate search paths

  1. get_file_from_folder_list can return None; calling os.path.exists(None) raises TypeError.
  2. paths_loras & paths_performance_loras may contain None or overlap, yielding redundant work.
-                lora_filename = get_file_from_folder_list(filename, loras_paths)
-
-            if not os.path.exists(lora_filename):
+                lora_filename = get_file_from_folder_list(filename, loras_paths)
+
+            # Abort early if lookup failed
+            if not lora_filename or not os.path.exists(lora_filename):
                 continue

Optionally build loras_paths via a set() to deduplicate.
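
For example, an order-preserving variant (a sketch; assumes both config values are lists or None, so scalar strings would still need the isinstance handling above):

raw_paths = [*(modules.config.paths_loras or []),
             *(modules.config.paths_performance_loras or [])]
# dict.fromkeys deduplicates while keeping search priority, unlike set()
loras_paths = [p for p in dict.fromkeys(raw_paths) if p]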

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In modules/core.py around lines 84 to 94, the code does not guard against None
values in paths_loras and paths_performance_loras, which can cause
get_file_from_folder_list to return None and lead to errors when checking file
existence. Also, the current approach may add duplicate or None paths, causing
redundant processing. Fix this by filtering out None values before adding paths,
use a set to collect unique paths from both config lists, then convert back to a
list before passing to get_file_from_folder_list.

Comment on lines +465 to +494
def move_lora(filepath: str, username: str) -> bool:
    """Takes a valid filepath as an input (supposed to be a LoRA file) and a username
    and moves the file to the user's LoRA directory.

    Returns True if the copy succeeded, False otherwise.
    """
    import shutil
    import os
    from pathlib import Path

    filepath = Path(filepath)
    print(f"FILEPATH: {filepath} and USERNAME: {username} and PATHS_LORAS: {modules.config.paths_loras}")
    destination_dir = Path(modules.config.paths_loras[0]) / username

    print(f"User {username} requested to move {filepath}")

    if filepath.is_file():
        try:
            if not destination_dir.exists():
                # Prevent parent creation to be safe with folder creation
                destination_dir.mkdir(parents=False, exist_ok=True)

            shutil.copy(filepath, destination_dir.as_posix())
            os.remove(filepath)  # Deleting temp file
            return True

        except Exception as e:
            print("Error while copying uploaded LoRA:", e)

    return False

🛠️ Refactor suggestion

⚠️ Potential issue

Harden move_lora – path-sanitisation, logging, and atomic move

  1. Security: username is concatenated straight into the path.
    move_lora("../../etc", "..") would happily escape the intended root.
    Strip separators and forbid .. components.

  2. Debug prints leak absolute paths & usernames; switch to logging.debug or drop them in production code.

  3. destination_dir.mkdir(parents=False) explodes if the parent folder is absent.
    Use parents=True or explicitly check.

  4. Instead of shutil.copy + os.remove, call shutil.move for atomic semantics and to preserve metadata.

-    destination_dir = Path(modules.config.paths_loras[0]) / username
+    safe_user = Path(username).name          # remove path traversal
+    destination_dir = Path(modules.config.paths_loras[0]) / safe_user

-    print(f"FILEPATH: {filepath} and USERNAME: {username} and PATHS_LORAS: {modules.config.paths_loras}")
-    ...
-            if not destination_dir.exists():
-                # Prevent parent creation to be safe with folder creation
-                destination_dir.mkdir(parents=False, exist_ok=True)  
+    # Avoid verbose printing in production; use logging if required
+    if not destination_dir.exists():
+        destination_dir.mkdir(parents=True, exist_ok=True)

-            shutil.copy(filepath, destination_dir.as_posix())
-            os.remove(filepath)  # Deleting temp file
+            shutil.move(filepath, destination_dir / filepath.name)

These tweaks tighten security and robustness.

📝 Committable suggestion


Suggested change
def move_lora(filepath: str, username: str) -> bool:
    """Takes a valid filepath as an input (supposed to be a LoRA file) and a username
    and moves the file to the user's LoRA directory.
    Returns True if the copy succeeded, False otherwise.
    """
    import shutil
    import os
    from pathlib import Path
    filepath = Path(filepath)
    print(f"FILEPATH: {filepath} and USERNAME: {username} and PATHS_LORAS: {modules.config.paths_loras}")
    destination_dir = Path(modules.config.paths_loras[0]) / username
    print(f"User {username} requested to move {filepath}")
    if filepath.is_file():
        try:
            if not destination_dir.exists():
                # Prevent parent creation to be safe with folder creation
                destination_dir.mkdir(parents=False, exist_ok=True)
            shutil.copy(filepath, destination_dir.as_posix())
            os.remove(filepath)  # Deleting temp file
            return True
        except Exception as e:
            print("Error while copying uploaded LoRA:", e)
    return False
def move_lora(filepath: str, username: str) -> bool:
    """Takes a valid filepath as an input (supposed to be a LoRA file) and a username
    and moves the file to the user's LoRA directory.
    Returns True if the copy succeeded, False otherwise.
    """
    import shutil
    import os
    from pathlib import Path
    filepath = Path(filepath)
    safe_user = Path(username).name  # remove path traversal
    destination_dir = Path(modules.config.paths_loras[0]) / safe_user
    print(f"User {username} requested to move {filepath}")
    if filepath.is_file():
        try:
            # Avoid verbose printing in production; use logging if required
            if not destination_dir.exists():
                destination_dir.mkdir(parents=True, exist_ok=True)
            shutil.move(filepath, destination_dir / filepath.name)
            return True
        except Exception as e:
            print("Error while copying uploaded LoRA:", e)
    return False
🤖 Prompt for AI Agents
In ldm_patched/modules/utils.py around lines 465 to 494, improve move_lora by
sanitizing the username to remove path separators and forbid '..' components to
prevent directory traversal. Replace print statements with logging.debug for
better logging practice. Change destination_dir.mkdir to use parents=True to
ensure parent directories are created without error. Finally, replace
shutil.copy followed by os.remove with shutil.move to perform an atomic move
operation that preserves metadata.
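
One caveat on the Path(username).name approach suggested above: it maps inputs such as ".." to an empty string rather than rejecting them, and "a/.." still yields "..". A stricter standalone sanitiser could look like this (illustrative only, not part of the PR):

import re
from pathlib import Path

def sanitize_username(username: str) -> str:
    candidate = Path(username).name            # drop directory components
    # whitelist a conservative character set; reject empty and dot-only names
    if not re.fullmatch(r"[A-Za-z0-9._-]{1,64}", candidate) or set(candidate) == {"."}:
        raise ValueError(f"unsafe username: {username!r}")
    return candidate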

Comment on lines +881 to +888
def handle_lora_upload(file, request: gr.Request):
    if file is None:
        return
    success = move_lora(file.name, request.username if hasattr(request, 'username') else "default")
    if success:
        print(f"Successfully moved LoRA file: {file.name}")
    else:
        print(f"Failed to move LoRA file: {file.name}")

⚠️ Potential issue

Pass the real file path to move_lora, not just the basename

file.name is only the original filename; after Gradio upload the temporary file is stored elsewhere.
If move_lora() expects an existing path, passing the basename will raise “file not found” or silently skip the move.

-    success = move_lora(file.name, request.username if hasattr(request, 'username') else "default")
+    # `file` is already a `Path`/str pointing to the temp file on disk.
+    success = move_lora(str(file), request.username if hasattr(request, 'username') else "default")
📝 Committable suggestion


Suggested change
def handle_lora_upload(file, request: gr.Request):
    if file is None:
        return
    success = move_lora(file.name, request.username if hasattr(request, 'username') else "default")
    if success:
        print(f"Successfully moved LoRA file: {file.name}")
    else:
        print(f"Failed to move LoRA file: {file.name}")
def handle_lora_upload(file, request: gr.Request):
    if file is None:
        return
    # `file` is already a `Path`/str pointing to the temp file on disk.
    success = move_lora(str(file), request.username if hasattr(request, 'username') else "default")
    if success:
        print(f"Successfully moved LoRA file: {file.name}")
    else:
        print(f"Failed to move LoRA file: {file.name}")
🤖 Prompt for AI Agents
In webui.py around lines 881 to 888, the function handle_lora_upload passes only
the basename file.name to move_lora, but move_lora requires the actual file path
of the uploaded file. Update the call to move_lora to pass the full temporary
file path of the uploaded file instead of just file.name, ensuring the file
exists and can be moved correctly.

Comment on lines +247 to 248
paths_performance_loras = [f'{path_models_root}/loras/performance/', '../models/loras/performance/']
path_embeddings = get_dir_or_set_default('path_embeddings', f'{path_models_root}/embeddings/')

🛠️ Refactor suggestion

paths_performance_loras directories are never created

paths_performance_loras is assigned a raw list; unlike every other path list, it bypasses get_dir_or_set_default, so the folder may not exist. Recommend:

-paths_performance_loras = [f'{path_models_root}/loras/performance/', '../models/loras/performance/']
+paths_performance_loras = get_dir_or_set_default(
+    'path_performance_loras',
+    [f'{path_models_root}/loras/performance/', '../models/loras/performance/'],
+    as_array=True,
+    make_directory=True
+)
📝 Committable suggestion


Suggested change
paths_performance_loras = [f'{path_models_root}/loras/performance/', '../models/loras/performance/']
path_embeddings = get_dir_or_set_default('path_embeddings', f'{path_models_root}/embeddings/')
paths_performance_loras = get_dir_or_set_default(
    'path_performance_loras',
    [f'{path_models_root}/loras/performance/', '../models/loras/performance/'],
    as_array=True,
    make_directory=True
)
path_embeddings = get_dir_or_set_default('path_embeddings', f'{path_models_root}/embeddings/')
🤖 Prompt for AI Agents
In modules/config.py around lines 247 to 248, the variable
paths_performance_loras is assigned a list of directory paths directly without
ensuring these directories exist. To fix this, replace the raw list assignment
with calls to get_dir_or_set_default for each path to guarantee the directories
are created or set to defaults, consistent with other path variables.

Comment on lines +166 to +184
def get_loras_path() -> str:
    global config_dict
    path_loras = [f'{path_models_root}/loras/', '../models/loras/']
    if args_manager.args.path_loras:
        if isinstance(args_manager.args.path_loras, list):
            path_loras = args_manager.args.path_loras
        else:
            path_loras = [args_manager.args.path_loras, args_manager.args.path_loras]
        config_dict['path_loras'] = args_manager.args.path_loras
    elif 'path_loras' in config_dict and config_dict['path_loras']:
        if isinstance(config_dict['path_loras'], list):
            path_loras = config_dict['path_loras']
        else:
            path_loras = [config_dict['path_loras'], config_dict['path_loras']]

        path_loras = get_dir_or_set_default('path_loras', path_loras, True)

    print(f'Loras will be stored in path: {path_loras}')
    return path_loras

🛠️ Refactor suggestion

⚠️ Potential issue

Fix type-hint, ensure directory creation & align with new “common” sub-folder

Several issues in get_loras_path() can bite later:

  1. The function returns a list[str], not a single str.
  2. Default paths should already include the common/ sub-directory introduced elsewhere – right now they point one level too high.
  3. get_dir_or_set_default() is only invoked in the elif branch, so when the CLI flag is used or when defaults are taken, the folders are never validated / created.
  4. Because of (3) you may hit “file not found” errors when users try first-time uploads.

A compact patch:

-def get_loras_path() -> str:
+def get_loras_path() -> list[str]:
@@
-    path_loras = [f'{path_models_root}/loras/', '../models/loras/']
+    # Default to the new 'common' namespace
+    path_loras = [f'{path_models_root}/loras/common/', '../models/loras/common/']
@@
-        config_dict['path_loras'] = args_manager.args.path_loras
+        config_dict['path_loras'] = args_manager.args.path_loras
@@
-    elif 'path_loras' in config_dict and config_dict['path_loras']:
+    elif 'path_loras' in config_dict and config_dict['path_loras']:
@@
-    
-        path_loras = get_dir_or_set_default('path_loras', path_loras, True)
-    
-    print(f'Loras will be stored in path: {path_loras}')
-    return path_loras
+    # Always validate / create directories
+    path_loras = get_dir_or_set_default('path_loras', path_loras, True, make_directory=True)
+    print(f'Loras will be stored in path: {path_loras}')
+    return path_loras
📝 Committable suggestion


Suggested change
def get_loras_path() -> str:
    global config_dict
    path_loras = [f'{path_models_root}/loras/', '../models/loras/']
    if args_manager.args.path_loras:
        if isinstance(args_manager.args.path_loras, list):
            path_loras = args_manager.args.path_loras
        else:
            path_loras = [args_manager.args.path_loras, args_manager.args.path_loras]
        config_dict['path_loras'] = args_manager.args.path_loras
    elif 'path_loras' in config_dict and config_dict['path_loras']:
        if isinstance(config_dict['path_loras'], list):
            path_loras = config_dict['path_loras']
        else:
            path_loras = [config_dict['path_loras'], config_dict['path_loras']]

        path_loras = get_dir_or_set_default('path_loras', path_loras, True)

    print(f'Loras will be stored in path: {path_loras}')
    return path_loras
def get_loras_path() -> list[str]:
    global config_dict
    # Default to the new 'common' namespace
    path_loras = [f'{path_models_root}/loras/common/', '../models/loras/common/']
    if args_manager.args.path_loras:
        if isinstance(args_manager.args.path_loras, list):
            path_loras = args_manager.args.path_loras
        else:
            path_loras = [args_manager.args.path_loras, args_manager.args.path_loras]
        config_dict['path_loras'] = args_manager.args.path_loras
    elif 'path_loras' in config_dict and config_dict['path_loras']:
        if isinstance(config_dict['path_loras'], list):
            path_loras = config_dict['path_loras']
        else:
            path_loras = [config_dict['path_loras'], config_dict['path_loras']]

    # Always validate / create directories
    path_loras = get_dir_or_set_default('path_loras', path_loras, True, make_directory=True)
    print(f'Loras will be stored in path: {path_loras}')
    return path_loras
🤖 Prompt for AI Agents
In modules/config.py around lines 166 to 184, the get_loras_path function has an
incorrect return type hint (should be list of strings, not a single string),
default paths missing the "common/" sub-folder, and directory creation is only
triggered in the elif branch. Fix by changing the return type hint to list[str],
update default paths to include the "common/" sub-folder, and ensure
get_dir_or_set_default is called regardless of whether the path comes from CLI
args, config, or defaults to validate and create directories as needed.

Comment on lines 1075 to 1100
 def downloading_sdxl_lcm_lora():
     load_file_from_url(
         url='https://huggingface.co/lllyasviel/misc/resolve/main/sdxl_lcm_lora.safetensors',
-        model_dir=paths_loras[0],
+        model_dir=paths_performance_loras[0],
         file_name=modules.flags.PerformanceLoRA.EXTREME_SPEED.value
     )
-    return modules.flags.PerformanceLoRA.EXTREME_SPEED.value
+    return f"{modules.flags.PerformanceLoRA.EXTREME_SPEED.value}"


 def downloading_sdxl_lightning_lora():
     load_file_from_url(
         url='https://huggingface.co/mashb1t/misc/resolve/main/sdxl_lightning_4step_lora.safetensors',
-        model_dir=paths_loras[0],
+        model_dir=paths_performance_loras[0],
         file_name=modules.flags.PerformanceLoRA.LIGHTNING.value
     )
-    return modules.flags.PerformanceLoRA.LIGHTNING.value
+    return f"{modules.flags.PerformanceLoRA.LIGHTNING.value}"


 def downloading_sdxl_hyper_sd_lora():
     load_file_from_url(
         url='https://huggingface.co/mashb1t/misc/resolve/main/sdxl_hyper_sd_4step_lora.safetensors',
-        model_dir=paths_loras[0],
+        model_dir=paths_performance_loras[0],
         file_name=modules.flags.PerformanceLoRA.HYPER_SD.value
     )
-    return modules.flags.PerformanceLoRA.HYPER_SD.value
+    return f"{modules.flags.PerformanceLoRA.HYPER_SD.value}"

💡 Verification agent

🧩 Analysis chain

Download helpers now return just the file-name, breaking previous callers

downloading_sdxl_*_lora() used to return the full path; after the change it returns only the bare filename:

return f"{modules.flags.PerformanceLoRA.EXTREME_SPEED.value}"

Down-stream code expecting os.path.join(dir, file) will now fail to locate the file.

Either restore the old behaviour:

-    return f"{modules.flags.PerformanceLoRA.EXTREME_SPEED.value}"
+    return os.path.join(paths_performance_loras[0], modules.flags.PerformanceLoRA.EXTREME_SPEED.value)

and similarly for the other two helpers, or audit every call-site.

Also ensure the parent directory exists (issue noted in a previous comment).


🏁 Script executed:

#!/bin/bash
set -e

echo "🔍 Searching for call‐sites of downloading_sdxl_*_lora()..."
rg -n "downloading_sdxl_(lcm|lightning|hyper)_lora" .

echo
echo "🔍 Locating load_file_from_url definition to inspect its return value..."
rg -n "def load_file_from_url" -A10 .

Length of output: 4600


Restore full file path in download helpers

The downloading_sdxl_*_lora() functions currently discard the path returned by load_file_from_url and only return the bare filename, but downstream code (e.g. in modules/async_worker.py) expects the full file path. Update each helper to return the result of load_file_from_url and ensure its target directory exists:

• File: modules/config.py (around lines 1075–1100)

def downloading_sdxl_lcm_lora():
-   load_file_from_url(
+   return load_file_from_url(
       url='https://huggingface.co/lllyasviel/misc/resolve/main/sdxl_lcm_lora.safetensors',
       model_dir=paths_performance_loras[0],
       file_name=modules.flags.PerformanceLoRA.EXTREME_SPEED.value
   )
-   return f"{modules.flags.PerformanceLoRA.EXTREME_SPEED.value}"

Apply the same pattern to downloading_sdxl_lightning_lora() and downloading_sdxl_hyper_sd_lora().
Also, before calling load_file_from_url, ensure paths_performance_loras[0] exists (e.g., os.makedirs(..., exist_ok=True)).

📝 Committable suggestion


Suggested change
def downloading_sdxl_lcm_lora():
    load_file_from_url(
        url='https://huggingface.co/lllyasviel/misc/resolve/main/sdxl_lcm_lora.safetensors',
        model_dir=paths_performance_loras[0],
        file_name=modules.flags.PerformanceLoRA.EXTREME_SPEED.value
    )
    return f"{modules.flags.PerformanceLoRA.EXTREME_SPEED.value}"
def downloading_sdxl_lightning_lora():
    load_file_from_url(
        url='https://huggingface.co/mashb1t/misc/resolve/main/sdxl_lightning_4step_lora.safetensors',
        model_dir=paths_performance_loras[0],
        file_name=modules.flags.PerformanceLoRA.LIGHTNING.value
    )
    return f"{modules.flags.PerformanceLoRA.LIGHTNING.value}"
def downloading_sdxl_hyper_sd_lora():
    load_file_from_url(
        url='https://huggingface.co/mashb1t/misc/resolve/main/sdxl_hyper_sd_4step_lora.safetensors',
        model_dir=paths_performance_loras[0],
        file_name=modules.flags.PerformanceLoRA.HYPER_SD.value
    )
    return f"{modules.flags.PerformanceLoRA.HYPER_SD.value}"
def downloading_sdxl_lcm_lora():
    return load_file_from_url(
        url='https://huggingface.co/lllyasviel/misc/resolve/main/sdxl_lcm_lora.safetensors',
        model_dir=paths_performance_loras[0],
        file_name=modules.flags.PerformanceLoRA.EXTREME_SPEED.value
    )
def downloading_sdxl_lightning_lora():
    return load_file_from_url(
        url='https://huggingface.co/mashb1t/misc/resolve/main/sdxl_lightning_4step_lora.safetensors',
        model_dir=paths_performance_loras[0],
        file_name=modules.flags.PerformanceLoRA.LIGHTNING.value
    )
def downloading_sdxl_hyper_sd_lora():
    return load_file_from_url(
        url='https://huggingface.co/mashb1t/misc/resolve/main/sdxl_hyper_sd_4step_lora.safetensors',
        model_dir=paths_performance_loras[0],
        file_name=modules.flags.PerformanceLoRA.HYPER_SD.value
    )
🤖 Prompt for AI Agents
In modules/config.py around lines 1075 to 1100, the downloading_sdxl_*_lora
functions currently return only the filename string instead of the full file
path, which breaks downstream code expecting the full path. To fix this, modify
each function to first ensure the directory paths_performance_loras[0] exists by
creating it if necessary, then call load_file_from_url and return its full path
result instead of just the filename. Apply this fix consistently to
downloading_sdxl_lcm_lora, downloading_sdxl_lightning_lora, and
downloading_sdxl_hyper_sd_lora.
