Walkthrough
The changes introduce user-specific LoRA file management and upload functionality. LoRA paths are restructured to include "common" and per-user subdirectories, with configuration and model selection logic updated accordingly. The web UI now allows authenticated users to upload LoRA files, which are stored in user-specific folders, and the UI dynamically refreshes to reflect uploaded files.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant WebUI
    participant Utils as move_lora (utils)
    participant Config as Config/ModelInfo
    User->>WebUI: Upload LoRA file (.safetensors/.ckpt)
    WebUI->>Utils: move_lora(filepath, username)
    Utils->>Config: Store file in user-specific LoRA directory
    Utils-->>WebUI: Success/Failure
    WebUI->>Config: Refresh LoRA/model dropdowns with user context
    Config-->>WebUI: Updated file lists (including user and "common" LoRAs)
    WebUI-->>User: Updated UI reflecting new LoRA file
```
Actionable comments posted: 10
🧹 Nitpick comments (14)
ldm_patched/modules/args_parser.py (1)
43-43: Add help text for new `--path-loras` argument
The `--path-loras` option currently lacks a `help` description and `metavar`, which impacts CLI discoverability. Please include a descriptive help string (e.g., `help="Custom LoRA models base path"`) and consider adding a `metavar` for clarity.
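A minimal sketch of what that could look like, assuming the argparse parser object is named `parser` as in typical setups (the type and default here are illustrative, not the repo's actual values):

```python
# Hedged sketch: type/default are assumptions; help/metavar are the point.
parser.add_argument(
    "--path-loras", type=str, default=None, metavar="PATH",
    help="Custom LoRA models base path"
)
```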
presets/4GB_Default.json (1)
8-9: Back-slash in JSON paths couples presets to Windows paths
"common\\sd_xl_offset_example-lora_1.0.safetensors"works on Windows but is a literal
back-slash on POSIX. All runtime code usesos.path.join, so keep JSON portable by using a
forward slash (/) or no separator at all (and let the loader prepend"common").Example:
[ true, "common/sd_xl_offset_example-lora_1.0.safetensors", 0.5 ]Apply the same convention to every preset touched in this PR.
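If the presets must keep working either way during a transition, the loader could normalize separators once at read time; a small sketch (the function name is hypothetical):

```python
import os

def normalize_preset_path(raw: str) -> str:
    """Accept '/' or '\\' in preset JSON values and rebuild with the host separator."""
    return os.path.join(*raw.replace("\\", "/").split("/"))

# normalize_preset_path("common\\Flux\\AntiBlur.safetensors")
# -> "common/Flux/AntiBlur.safetensors" on POSIX,
#    "common\\Flux\\AntiBlur.safetensors" on Windows
```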
presets/Flux1D_GGUF.json (1)
26-29: Platform-neutral separator recommended
Same concern as above: `"common\\Flux\\Hyper-FLUX.1-dev-8steps-lora.safetensors"` embeds Windows separators. Prefer `/` or let the runtime build the path.
presets/RealVisXL.json (1)
6-9: Use POSIX separators or leave path building to code
All three modified LoRA paths contain `\\`. This will create files with back-slashes in their names on Linux. Align with the cross-platform advice given for the other presets.
presets/HyperFlux8.json (1)
26-32: Hard-coded back-slashes reduce portability
The default-LoRA paths embed Windows back-slashes; on Linux/macOS these are just literal characters and path resolution may fail. Consider storing paths with forward-slashes or resolving them with `os.path.join` at runtime instead of baking a platform-specific separator into the JSON.
presets/Juggernaut.json (1)
7-7: Same portability note about back-slashes
See previous comment – favour forward-slashes to keep presets platform-neutral.
presets/Flux1S_GGUF.json (1)
27-28: Back-slash paths hinder cross-platform usage
Same remark as above – prefer forward-slashes or build paths at runtime.
presets/Flux1D_8GGUF.json (1)
27-28: Back-slash paths hinder cross-platform usage
See earlier comment; unify path separators for portability.
presets/HyperFlux.json (1)
26-32: Prefer POSIX-style forward slashes for cross-platform paths
The new LoRA entries use back-slash separators (`"common\\Flux\\..."`). Most of the Python code later treats model paths as raw strings and then feeds them to `Path`/`os.path`, which works on Windows but breaks on POSIX when the JSON value is re-used verbatim (e.g. in HTTP APIs).

```diff
- [false, "common\\Flux\\FluxDFaeTasticDetails.safetensors", 0.8],
- [false, "common\\Flux\\Flux_BlackColor_SaMay.safetensors", 0.7],
+ [false, "common/Flux/FluxDFaeTasticDetails.safetensors", 0.8],
+ [false, "common/Flux/Flux_BlackColor_SaMay.safetensors", 0.7],
```

Using `/` keeps the preset portable for every runtime.
simpleai_base/models_info.py (1)
689-696: Clarify filter condition & adopt membership check
The current expression relies on operator precedence and is hard to read:

```python
if model_type != "common" and model_type != user or model_type == "performance":
```

Parenthesise and use `not in` for clarity (also silences the Ruff/Pylint hints):

```diff
- if model_type != "common" and model_type != user or model_type == "performance":
+ if model_type == "performance" or model_type not in {"common", user}:
      continue
```

Behaviour is unchanged but intent is explicit.
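For anyone unsure why the rewrite is safe: `and` binds tighter than `or`, so the two forms are logically identical. A quick exhaustive check:

```python
from itertools import product

# Both conditions agree for every combination of values.
for model_type, user in product(["common", "performance", "alice"], ["alice", "bob"]):
    old = model_type != "common" and model_type != user or model_type == "performance"
    new = model_type == "performance" or model_type not in {"common", user}
    assert old == new
```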
🧰 Tools
🪛 Pylint (3.3.7)
[refactor] 692-692: Consider merging these comparisons with 'in' by using 'model_type not in ('common', user)'. Use a set instead if elements are hashable.
(R1714)
webui.py (3)
890-904: Refresh logic duplicated – extract a helper
`refresh_files_clicked` and `refresh_after_lora_upload` repeat almost identical blocks (model/LoRA list rebuild). Move the shared code into a single utility to avoid future drift and ease maintenance.
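One possible shape for the shared helper; the name and the `update_files` return tuple are assumptions based on this review, not the actual code:

```python
def rebuild_model_choices(user: str):
    """Rebuild model/LoRA dropdown choices for one user (hypothetical helper)."""
    model_filenames, lora_filenames, vae_filenames = modules.config.update_files(user=user)
    return model_filenames, lora_filenames, vae_filenames
```

Both callbacks would then delegate to this and only differ in how they wire the results into Gradio outputs.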
🧰 Tools
🪛 Ruff (0.11.9)
899-899: Loop control variable `i` not used within loop body. Rename unused `i` to `_i` (B007)
870-873: Loop index unused
`for i in range(...):` – `i` is never referenced; rename to `_` or drop the variable to silence linters and clarify intent.
🧰 Tools
🪛 Ruff (0.11.9)
870-870: Loop control variable `i` not used within loop body. Rename unused `i` to `_i` (B007)
898-901: Same unused loop variable as above
Apply the same rename here for consistency.
🧰 Tools
🪛 Ruff (0.11.9)
899-899: Loop control variable `i` not used within loop body. Rename unused `i` to `_i` (B007)
modules/config.py (1)
1026-1031: `update_files()` signature/return mismatch & silent parameter propagation
Good call exposing a `user` parameter, but:

- The docstring (missing) and callers need updating; otherwise devs won’t know the additional filter exists.
- The function still returns only three lists even though it populates `wildcard_filenames`, which makes the global harder to reason about. Either return it or leave a comment why it’s global-only.

Minor, yet worth tightening for maintainability.
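By way of illustration, a docstring along these lines would cover both points (the parameter semantics are inferred from this review, not the actual code):

```python
def update_files(user=None):
    """Rescan model folders and rebuild cached filename lists.

    Args:
        user: Optional username. When given, LoRA results are limited to
            the "common" folder plus this user's folder.

    Returns:
        (model_filenames, lora_filenames, vae_filenames). The global
        wildcard_filenames is refreshed as a side effect, not returned.
    """
```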
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (36)
- launch.py (1 hunks)
- ldm_patched/modules/args_parser.py (1 hunks)
- ldm_patched/modules/utils.py (2 hunks)
- modules/auth.py (0 hunks)
- modules/config.py (4 hunks)
- modules/core.py (1 hunks)
- modules/model_structure.py (1 hunks)
- presets/4GB_Default.json (1 hunks)
- presets/4GB_Pony.json (1 hunks)
- presets/AnimaPencil.json (1 hunks)
- presets/Cheyenne.json (1 hunks)
- presets/Flux1D.json (1 hunks)
- presets/Flux1D_8GGUF.json (1 hunks)
- presets/Flux1D_GGUF.json (1 hunks)
- presets/Flux1S.json (1 hunks)
- presets/Flux1S_GGUF.json (1 hunks)
- presets/HyDiT.json (1 hunks)
- presets/HyperFlux.json (1 hunks)
- presets/HyperFlux16.json (1 hunks)
- presets/HyperFlux8.json (1 hunks)
- presets/Juggernaut.json (1 hunks)
- presets/Kolors.json (1 hunks)
- presets/KolorsFast.json (1 hunks)
- presets/KolorsTile.json (1 hunks)
- presets/Magick.json (1 hunks)
- presets/Playground.json (1 hunks)
- presets/Pony.json (1 hunks)
- presets/PonyFaeTality.json (1 hunks)
- presets/RealVisXL.json (1 hunks)
- presets/RealisticPhoto.json (1 hunks)
- presets/SD1.5_RealVis.json (1 hunks)
- presets/SDXL.json (1 hunks)
- presets/Tile.json (1 hunks)
- presets/default.json (1 hunks)
- simpleai_base/models_info.py (2 hunks)
- webui.py (3 hunks)
💤 Files with no reviewable changes (1)
- modules/auth.py
🧰 Additional context used
🧬 Code Graph Analysis (1)
launch.py (1)
modules/util.py (1)
`get_file_from_folder_list` (402-411)
🪛 Ruff (0.11.9)
simpleai_base/models_info.py
685-685: Use key in dict instead of key in dict.keys()
Remove .keys()
(SIM118)
webui.py
870-870: Loop control variable i not used within loop body
Rename unused i to _i
(B007)
899-899: Loop control variable i not used within loop body
Rename unused i to _i
(B007)
🪛 Pylint (3.3.7)
simpleai_base/models_info.py
[refactor] 680-680: Too many arguments (6/5)
(R0913)
[refactor] 680-680: Too many positional arguments (6/5)
(R0917)
[refactor] 692-692: Consider merging these comparisons with 'in' by using 'model_type not in ('common', user)'. Use a set instead if elements are hashable.
(R1714)
[refactor] 680-680: Too many branches (14/12)
(R0912)
🔇 Additional comments (20)
presets/Pony.json (1)
2-33: Formatting-only change: The JSON was compacted and re-indented for consistency without any semantic or path modifications. All keys, values, and download URLs remain unchanged.
presets/Tile.json (1)
2-54: Stylistic JSON reformat only: The file was restructured to a more compact, single-line format for objects and arrays. No functional or path changes were introduced.
presets/AnimaPencil.json (1)
2-41: Stylistic JSON reformat only: All LoRA, model, and download entries remain identical; only whitespace and line breaks were adjusted.
presets/PonyFaeTality.json (1)
2-31: Stylistic JSON reformat only: Formatting was condensed without altering any data or default settings.
presets/Cheyenne.json (1)
2-32: Reformat-only changes, no semantic modifications.
These updates only adjust whitespace and indentation; the data remains unchanged.
presets/default.json (1)
7-8: LoRA path prefix updated correctly.
The two entries now include the `common\` prefix to match the new directory structure and updated loading logic.
presets/SD1.5_RealVis.json (1)
2-38: Reformat-only changes, no semantic modifications.
These updates only adjust whitespace and indentation; the data remains unchanged.
presets/4GB_Pony.json (1)
2-30: Reformat-only changes, no semantic modifications.
These updates only adjust whitespace and indentation; the data remains unchanged.
presets/HyDiT.json (1)
1-40: Reformat-only changes, no semantic modifications.
These updates only adjust whitespace and indentation; the data remains unchanged.
presets/KolorsTile.json (1)
2-45: Formatting-only changes
The JSON has been reformatted for compactness and consistent 2-space indentation. No keys, values, or semantic content were altered.
presets/Magick.json (1)
7-8: Validate LoRA path prefix & cross-platform handling
The first two entries in `default_loras` now correctly include the `"common\\"` prefix. Ensure your loader normalizes these backslashes on non-Windows systems or consider using forward slashes for portability.
presets/Kolors.json (1)
2-28: Formatting-only changes
Whitespace and line breaks were condensed; all keys and values remain unchanged.
presets/Flux1S.json (1)
2-50: Formatting-only changes
Reformatted arrays/objects into more compact inline form without modifying any semantic data.
presets/Playground.json (1)
7-8: Validate LoRA path prefix & cross-platform handling
The first two `default_loras` paths have been updated to include `"common\\"`. Confirm the application handles these backslashes on POSIX or switch to `/` separators for consistency.
presets/KolorsFast.json (1)
23-23: Verify the "common" prefix and path separators in LoRA paths.
The LoRA reference was updated to `"common\\Hyper-SDXL-8steps-lora.safetensors"`. Ensure that the JSON parser and file-loading logic handle Windows-style backslashes correctly across all platforms, or consider using forward slashes (`"common/Hyper-SDXL-8steps-lora.safetensors"`) for platform-agnostic paths.
presets/Flux1D.json (1)
1-51: Approve formatting-only changes.
This file only underwent whitespace and indentation adjustments without any semantic modifications.
presets/SDXL.json (1)
6-12: Validate updated LoRA path prefix.
The default LoRA entry now points to `"common\\sd_xl_offset_example-lora_1.0.safetensors"`. Confirm that the loader supports this "common" subdirectory and handles backslashes correctly, or switch to forward slashes for consistency.
presets/RealisticPhoto.json (1)
6-12: Check consistency of LoRA path prefix.
The first LoRA model path was changed to `"common\\SDXL_FILM_PHOTOGRAPHY_STYLE_V1.safetensors"`. Verify that this matches the directory structure created by `modules/model_structure.py` and is accessible at runtime.
presets/HyperFlux16.json (1)
26-27: Confirm new "common" directory for multiple LoRA entries.
Entries for `"common\\Flux\\FluxDFaeTasticDetails.safetensors"` and `"common\\Flux\\AntiBlur.safetensors"` were prefixed accordingly. Ensure that nested subdirectories under "common" are created and the search logic in `modules/core.py` covers these paths.
simpleai_base/models_info.py (1)
680-683: Good fix – replaced mutable default with `None`
Switching the default value of `filters` from `[]` to `None` prevents the classic mutable-default foot-gun. No action needed.
🧰 Tools
🪛 Pylint (3.3.7)
[refactor] 680-680: Too many arguments (6/5)
(R0913)
[refactor] 680-680: Too many positional arguments (6/5)
(R0917)
[refactor] 680-680: Too many branches (14/12)
(R0912)
```python
common_paths = [os.path.join(path, 'common') for path in config.paths_loras]
model_dir = os.path.dirname(get_file_from_folder_list(file_name, common_paths))
load_file_from_url(url=url, model_dir=model_dir, file_name=file_name)
```
💡 Verification agent
❓ Verification inconclusive
Hard-coding the "common" sub-folder scatters path logic and risks duplication
The same "common" suffix is now appended here, in modules/model_structure.create_model_structure(), and (implicitly) in several preset JSON files. When the convention changes again (e.g. per-user folders, a different tier like "public"), every call-site must be audited.
```diff
-common_paths = [os.path.join(path, 'common') for path in config.paths_loras]
+# keep the “where are my LoRAs?” rule in one place
+common_paths = config.get_common_lora_paths()
```

Consider adding a helper such as `config.get_common_lora_paths()` (or re-using one if it already exists) so the rule lives in exactly one module.
Extract “common” path logic into a single helper
To avoid scattering the "common" suffix in multiple places, centralize it in one method on config.
• In launch.py (lines 147–149) and in modules/model_structure.py you’re appending "common" to each entry in config.paths_loras.
• Preset JSON files also hard-code the "common" folder.
Proposed changes:
- Add to your config class (e.g. in config.py):
```python
def get_common_lora_paths(self) -> list[str]:
    """Return each LORA root path with the 'common' subfolder appended."""
    return [os.path.join(path, "common") for path in self.paths_loras]
```

- Replace both call-sites:

```diff
- common_paths = [os.path.join(path, 'common') for path in config.paths_loras]
+ common_paths = config.get_common_lora_paths()
```

This keeps the convention in one place and prevents divergence when folder tiers change.
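The other call-site in modules/model_structure.py could then reuse the same helper; a sketch using the subfolder names introduced in this PR:

```python
# Hypothetical usage of the proposed helper; get_common_lora_paths()
# already appends "common" to each LoRA root.
for base in config.get_common_lora_paths():
    for sub in ("Alternative", "Flux", "Pony", "SD1.5", "SD3x"):
        os.makedirs(os.path.join(base, sub), exist_ok=True)
```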
🤖 Prompt for AI Agents
In launch.py around lines 147 to 149, the "common" subfolder is hard-coded when
constructing paths, which duplicates logic also found in
modules/model_structure.py and preset JSON files. To fix this, add a helper
method get_common_lora_paths() to the config class that returns paths_loras
entries with "common" appended, then replace the list comprehension in launch.py
and the similar code in model_structure.py with calls to this new helper. This
centralizes the "common" path logic and prevents inconsistencies.
```python
os.makedirs(config.paths_loras[0] + '\common\Alternative', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\Flux', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\Pony', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\SD1.5', exist_ok=True)
os.makedirs(config.paths_loras[0] + '\common\SD3x', exist_ok=True)
```
Windows-only path separators will break on Linux / macOS
`os.makedirs(config.paths_loras[0] + '\common\Alternative', …)` embeds back-slashes. On POSIX these are literal characters, so the code creates a directory literally named “…/loras\common\Alternative”, and later look-ups using `os.path.join(base, "common", "Alternative")` will not find it.
```diff
-os.makedirs(config.paths_loras[0] + '\common\Alternative', exist_ok=True)
-os.makedirs(config.paths_loras[0] + '\common\Flux', exist_ok=True)
-…
+base = config.paths_loras[0]
+for sub in ("Alternative", "Flux", "Pony", "SD1.5", "SD3x"):
+    os.makedirs(os.path.join(base, "common", sub), exist_ok=True)
```

Please replace the string concatenation with `os.path.join` (or pathlib) for full cross-platform support.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
base = config.paths_loras[0]
for sub in ("Alternative", "Flux", "Pony", "SD1.5", "SD3x"):
    os.makedirs(os.path.join(base, "common", sub), exist_ok=True)
```
🤖 Prompt for AI Agents
In modules/model_structure.py around lines 14 to 18, the directory paths are
constructed using string concatenation with backslashes, which causes issues on
Linux and macOS. Replace these concatenations with os.path.join to build the
paths in a cross-platform way, ensuring the directories are created correctly
regardless of the operating system.
```
1.0
]
{
"preset category": "Flux",
```
Inconsistent "preset category" key breaks preset parsing
Other presets use the key "preset_category" (underscore). Having a space here will force every consumer that relies on the canonical key to special-case this file or simply fail to load it. Rename the key to keep the contract stable.
- "preset category": "Flux",
+ "preset_category": "Flux",📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| "preset category": "Flux", | |
| "preset_category": "Flux", |
🤖 Prompt for AI Agents
In presets/HyperFlux8.json at line 2, the key "preset category" uses a space
instead of an underscore, which breaks preset parsing. Rename the key to
"preset_category" to maintain consistency with other presets and ensure all
consumers can correctly parse this file without special casing.
| ["None", 1.0], | ||
| ["None", 1.0] |
default_loras rows have inconsistent arity
Rows 4 & 5 only contain two elements, whereas the loader expects `[enabled:boolean, path:str, weight:number]`. Omitting the boolean will shift the values and be parsed as `(enabled='None', path=1.0, weight=<missing>)`, silently breaking LoRA loading.
- ["None", 1.0],
- ["None", 1.0]
+ [true, "None", 1.0],
+ [true, "None", 1.0]📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| ["None", 1.0], | |
| ["None", 1.0] | |
| [true, "None", 1.0], | |
| [true, "None", 1.0] |
🤖 Prompt for AI Agents
In presets/Juggernaut.json around lines 10 to 11, the entries in the
default_loras array have only two elements instead of the expected three:
enabled (boolean), path (string), and weight (number). To fix this, add the
missing boolean enabled value as the first element in each array entry, ensuring
the structure matches [enabled:boolean, path:string, weight:number] to prevent
incorrect parsing and loading errors.
```python
loras_paths = []
if isinstance(modules.config.paths_loras, list):
    loras_paths.extend(modules.config.paths_loras)
else:
    loras_paths.append(modules.config.paths_loras)
if isinstance(modules.config.paths_performance_loras, list):
    loras_paths.extend(modules.config.paths_performance_loras)
else:
    loras_paths.append(modules.config.paths_performance_loras)
lora_filename = get_file_from_folder_list(filename, loras_paths)
```
Guard against None returns and remove duplicate search paths
- `get_file_from_folder_list` can return `None`; calling `os.path.exists(None)` raises `TypeError`.
- `paths_loras` & `paths_performance_loras` may contain `None` or overlap, yielding redundant work.
```diff
- lora_filename = get_file_from_folder_list(filename, loras_paths)
-
- if not os.path.exists(lora_filename):
+ lora_filename = get_file_from_folder_list(filename, loras_paths)
+
+ # Abort early if lookup failed
+ if not lora_filename or not os.path.exists(lora_filename):
      continue
```

Optionally build `loras_paths` via a `set()` to deduplicate.
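An order-preserving way to do that while also dropping `None` entries (the helper name is hypothetical):

```python
def collect_lora_paths(*sources):
    """Flatten config path values (str, list, or None) into a deduplicated list."""
    paths = []
    for src in sources:
        if src is None:
            continue
        paths.extend(src if isinstance(src, list) else [src])
    return list(dict.fromkeys(p for p in paths if p))  # dedupe, keep order

# loras_paths = collect_lora_paths(modules.config.paths_loras,
#                                  modules.config.paths_performance_loras)
```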
Committable suggestion skipped: line range outside the PR's diff.
🤖 Prompt for AI Agents
In modules/core.py around lines 84 to 94, the code does not guard against None
values in paths_loras and paths_performance_loras, which can cause
get_file_from_folder_list to return None and lead to errors when checking file
existence. Also, the current approach may add duplicate or None paths, causing
redundant processing. Fix this by filtering out None values before adding paths,
use a set to collect unique paths from both config lists, then convert back to a
list before passing to get_file_from_folder_list.
```python
def move_lora(filepath: str, username: str) -> bool:
    """Takes a valid filepath as an input (supposed to be a LoRA file) and a username
    and moves the file to the user's LoRA directory.

    Returns True if the copy succeeded, False otherwise.
    """
    import shutil
    import os
    from pathlib import Path

    filepath = Path(filepath)
    print(f"FILEPATH: {filepath} and USERNAME: {username} and PATHS_LORAS: {modules.config.paths_loras}")
    destination_dir = Path(modules.config.paths_loras[0]) / username

    print(f"User {username} requested to move {filepath}")

    if filepath.is_file():
        try:
            if not destination_dir.exists():
                # Prevent parent creation to be safe with folder creation
                destination_dir.mkdir(parents=False, exist_ok=True)

            shutil.copy(filepath, destination_dir.as_posix())
            os.remove(filepath)  # Deleting temp file
            return True

        except Exception as e:
            print("Error while copying uploaded LoRA:", e)

    return False
```
🛠️ Refactor suggestion
Harden move_lora – path-sanitisation, logging, and atomic move
- Security: `username` is concatenated straight into the path. `move_lora("../../etc", "..")` would happily escape the intended root. Strip separators and forbid `..` components.
- Debug prints leak absolute paths & usernames; switch to `logging.debug` or drop them in production code.
- `destination_dir.mkdir(parents=False)` explodes if the parent folder is absent. Use `parents=True` or explicitly check.
- Instead of `shutil.copy` + `os.remove`, call `shutil.move` for atomic semantics and to preserve metadata.
```diff
- destination_dir = Path(modules.config.paths_loras[0]) / username
+ safe_user = Path(username).name  # remove path traversal
+ destination_dir = Path(modules.config.paths_loras[0]) / safe_user
- print(f"FILEPATH: {filepath} and USERNAME: {username} and PATHS_LORAS: {modules.config.paths_loras}")
- ...
- if not destination_dir.exists():
-     # Prevent parent creation to be safe with folder creation
-     destination_dir.mkdir(parents=False, exist_ok=True)
+ # Avoid verbose printing in production; use logging if required
+ if not destination_dir.exists():
+     destination_dir.mkdir(parents=True, exist_ok=True)
- shutil.copy(filepath, destination_dir.as_posix())
- os.remove(filepath)  # Deleting temp file
+ shutil.move(filepath, destination_dir / filepath.name)
```

These tweaks tighten security and robustness.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
def move_lora(filepath: str, username: str) -> bool:
    """Takes a valid filepath as an input (supposed to be a LoRA file) and a username
    and moves the file to the user's LoRA directory.
    Returns True if the copy succeeded, False otherwise.
    """
    import shutil
    import os
    from pathlib import Path

    filepath = Path(filepath)
    safe_user = Path(username).name  # remove path traversal
    destination_dir = Path(modules.config.paths_loras[0]) / safe_user
    print(f"User {username} requested to move {filepath}")
    if filepath.is_file():
        try:
            # Avoid verbose printing in production; use logging if required
            if not destination_dir.exists():
                destination_dir.mkdir(parents=True, exist_ok=True)
            shutil.move(filepath, destination_dir / filepath.name)
            return True
        except Exception as e:
            print("Error while copying uploaded LoRA:", e)
    return False
```
🤖 Prompt for AI Agents
In ldm_patched/modules/utils.py around lines 465 to 494, improve move_lora by
sanitizing the username to remove path separators and forbid '..' components to
prevent directory traversal. Replace print statements with logging.debug for
better logging practice. Change destination_dir.mkdir to use parents=True to
ensure parent directories are created without error. Finally, replace
shutil.copy followed by os.remove with shutil.move to perform an atomic move
operation that preserves metadata.
```python
def handle_lora_upload(file, request: gr.Request):
    if file is None:
        return
    success = move_lora(file.name, request.username if hasattr(request, 'username') else "default")
    if success:
        print(f"Successfully moved LoRA file: {file.name}")
    else:
        print(f"Failed to move LoRA file: {file.name}")
```
Pass the real file path to move_lora, not just the basename
file.name is only the original filename; after Gradio upload the temporary file is stored elsewhere.
If move_lora() expects an existing path, passing the basename will raise “file not found” or silently skip the move.
```diff
- success = move_lora(file.name, request.username if hasattr(request, 'username') else "default")
+ # `file` is already a `Path`/str pointing to the temp file on disk.
+ success = move_lora(str(file), request.username if hasattr(request, 'username') else "default")
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
def handle_lora_upload(file, request: gr.Request):
    if file is None:
        return
    # `file` is already a `Path`/str pointing to the temp file on disk.
    success = move_lora(str(file), request.username if hasattr(request, 'username') else "default")
    if success:
        print(f"Successfully moved LoRA file: {file.name}")
    else:
        print(f"Failed to move LoRA file: {file.name}")
```
🤖 Prompt for AI Agents
In webui.py around lines 881 to 888, the function handle_lora_upload passes only
the basename file.name to move_lora, but move_lora requires the actual file path
of the uploaded file. Update the call to move_lora to pass the full temporary
file path of the uploaded file instead of just file.name, ensuring the file
exists and can be moved correctly.
```python
paths_performance_loras = [f'{path_models_root}/loras/performance/', '../models/loras/performance/']
path_embeddings = get_dir_or_set_default('path_embeddings', f'{path_models_root}/embeddings/')
```
🛠️ Refactor suggestion
paths_performance_loras directories are never created
paths_performance_loras is assigned a raw list; unlike every other path list, it bypasses get_dir_or_set_default, so the folder may not exist. Recommend:
```diff
-paths_performance_loras = [f'{path_models_root}/loras/performance/', '../models/loras/performance/']
+paths_performance_loras = get_dir_or_set_default(
+    'path_performance_loras',
+    [f'{path_models_root}/loras/performance/', '../models/loras/performance/'],
+    as_array=True,
+    make_directory=True
+)
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
paths_performance_loras = get_dir_or_set_default(
    'path_performance_loras',
    [f'{path_models_root}/loras/performance/', '../models/loras/performance/'],
    as_array=True,
    make_directory=True
)
path_embeddings = get_dir_or_set_default('path_embeddings', f'{path_models_root}/embeddings/')
```
🤖 Prompt for AI Agents
In modules/config.py around lines 247 to 248, the variable
paths_performance_loras is assigned a list of directory paths directly without
ensuring these directories exist. To fix this, replace the raw list assignment
with calls to get_dir_or_set_default for each path to guarantee the directories
are created or set to defaults, consistent with other path variables.
```python
def get_loras_path() -> str:
    global config_dict
    path_loras = [f'{path_models_root}/loras/', '../models/loras/']
    if args_manager.args.path_loras:
        if isinstance(args_manager.args.path_loras, list):
            path_loras = args_manager.args.path_loras
        else:
            path_loras = [args_manager.args.path_loras, args_manager.args.path_loras]
        config_dict['path_loras'] = args_manager.args.path_loras
    elif 'path_loras' in config_dict and config_dict['path_loras']:
        if isinstance(config_dict['path_loras'], list):
            path_loras = config_dict['path_loras']
        else:
            path_loras = [config_dict['path_loras'], config_dict['path_loras']]

    path_loras = get_dir_or_set_default('path_loras', path_loras, True)

    print(f'Loras will be stored in path: {path_loras}')
    return path_loras
```
🛠️ Refactor suggestion
Fix type-hint, ensure directory creation & align with new “common” sub-folder
Several issues in get_loras_path() can bite later:
- The function returns a `list[str]`, not a single `str`.
- Default paths should already include the `common/` sub-directory introduced elsewhere – right now they point one level too high.
- `get_dir_or_set_default()` is only invoked in the `elif` branch, so when the CLI flag is used or when defaults are taken, the folders are never validated / created.
- Because of (3) you may hit “file not found” errors when users try first-time uploads.
A compact patch:
```diff
-def get_loras_path() -> str:
+def get_loras_path() -> list[str]:
@@
-    path_loras = [f'{path_models_root}/loras/', '../models/loras/']
+    # Default to the new 'common' namespace
+    path_loras = [f'{path_models_root}/loras/common/', '../models/loras/common/']
@@
-    path_loras = get_dir_or_set_default('path_loras', path_loras, True)
-
-    print(f'Loras will be stored in path: {path_loras}')
-    return path_loras
+    # Always validate / create directories
+    path_loras = get_dir_or_set_default('path_loras', path_loras, True, make_directory=True)
+    print(f'Loras will be stored in path: {path_loras}')
+    return path_loras
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
def get_loras_path() -> list[str]:
    global config_dict
    # Default to the new 'common' namespace
    path_loras = [f'{path_models_root}/loras/common/', '../models/loras/common/']
    if args_manager.args.path_loras:
        if isinstance(args_manager.args.path_loras, list):
            path_loras = args_manager.args.path_loras
        else:
            path_loras = [args_manager.args.path_loras, args_manager.args.path_loras]
        config_dict['path_loras'] = args_manager.args.path_loras
    elif 'path_loras' in config_dict and config_dict['path_loras']:
        if isinstance(config_dict['path_loras'], list):
            path_loras = config_dict['path_loras']
        else:
            path_loras = [config_dict['path_loras'], config_dict['path_loras']]

    # Always validate / create directories
    path_loras = get_dir_or_set_default('path_loras', path_loras, True, make_directory=True)
    print(f'Loras will be stored in path: {path_loras}')
    return path_loras
```
🤖 Prompt for AI Agents
In modules/config.py around lines 166 to 184, the get_loras_path function has an
incorrect return type hint (should be list of strings, not a single string),
default paths missing the "common/" sub-folder, and directory creation is only
triggered in the elif branch. Fix by changing the return type hint to list[str],
update default paths to include the "common/" sub-folder, and ensure
get_dir_or_set_default is called regardless of whether the path comes from CLI
args, config, or defaults to validate and create directories as needed.
```diff
 def downloading_sdxl_lcm_lora():
     load_file_from_url(
         url='https://huggingface.co/lllyasviel/misc/resolve/main/sdxl_lcm_lora.safetensors',
-        model_dir=paths_loras[0],
+        model_dir=paths_performance_loras[0],
         file_name=modules.flags.PerformanceLoRA.EXTREME_SPEED.value
     )
-    return modules.flags.PerformanceLoRA.EXTREME_SPEED.value
+    return f"{modules.flags.PerformanceLoRA.EXTREME_SPEED.value}"


 def downloading_sdxl_lightning_lora():
     load_file_from_url(
         url='https://huggingface.co/mashb1t/misc/resolve/main/sdxl_lightning_4step_lora.safetensors',
-        model_dir=paths_loras[0],
+        model_dir=paths_performance_loras[0],
         file_name=modules.flags.PerformanceLoRA.LIGHTNING.value
     )
-    return modules.flags.PerformanceLoRA.LIGHTNING.value
+    return f"{modules.flags.PerformanceLoRA.LIGHTNING.value}"


 def downloading_sdxl_hyper_sd_lora():
     load_file_from_url(
         url='https://huggingface.co/mashb1t/misc/resolve/main/sdxl_hyper_sd_4step_lora.safetensors',
-        model_dir=paths_loras[0],
+        model_dir=paths_performance_loras[0],
         file_name=modules.flags.PerformanceLoRA.HYPER_SD.value
     )
-    return modules.flags.PerformanceLoRA.HYPER_SD.value
+    return f"{modules.flags.PerformanceLoRA.HYPER_SD.value}"
```
💡 Verification agent
🧩 Analysis chain
Download helpers now return just the file-name, breaking previous callers
downloading_sdxl_*_lora() used to return the full path; after the change it returns only the bare filename:
return f"{modules.flags.PerformanceLoRA.EXTREME_SPEED.value}"Down-stream code expecting os.path.join(dir, file) will now fail to locate the file.
Either restore the old behaviour:
```diff
- return f"{modules.flags.PerformanceLoRA.EXTREME_SPEED.value}"
+ return os.path.join(paths_performance_loras[0], modules.flags.PerformanceLoRA.EXTREME_SPEED.value)
```

and similarly for the other two helpers, or audit every call-site.
Also ensure the parent directory exists (issue noted in a previous comment).
🏁 Script executed:
```bash
#!/bin/bash
set -e
echo "🔍 Searching for call-sites of downloading_sdxl_*_lora()..."
rg -n "downloading_sdxl_(lcm|lightning|hyper)_lora" .
echo
echo "🔍 Locating load_file_from_url definition to inspect its return value..."
rg -n "def load_file_from_url" -A10 .
```

Length of output: 4600
Restore full file path in download helpers
The downloading_sdxl_*_lora() functions currently discard the path returned by load_file_from_url and only return the bare filename, but downstream code (e.g. in modules/async_worker.py) expects the full file path. Update each helper to return the result of load_file_from_url and ensure its target directory exists:
• File: modules/config.py (around lines 1075–1100)
```diff
 def downloading_sdxl_lcm_lora():
-    load_file_from_url(
+    return load_file_from_url(
         url='https://huggingface.co/lllyasviel/misc/resolve/main/sdxl_lcm_lora.safetensors',
         model_dir=paths_performance_loras[0],
         file_name=modules.flags.PerformanceLoRA.EXTREME_SPEED.value
     )
-    return f"{modules.flags.PerformanceLoRA.EXTREME_SPEED.value}"
```

Apply the same pattern to `downloading_sdxl_lightning_lora()` and `downloading_sdxl_hyper_sd_lora()`.
Also, before calling load_file_from_url, ensure paths_performance_loras[0] exists (e.g., os.makedirs(..., exist_ok=True)).
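Concretely, that guard can be a single line ahead of each download call (a sketch; it assumes `paths_performance_loras` is in scope where the helpers are defined):

```python
# Ensure the performance-LoRA directory exists before downloading into it.
os.makedirs(paths_performance_loras[0], exist_ok=True)
```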
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```python
def downloading_sdxl_lcm_lora():
    return load_file_from_url(
        url='https://huggingface.co/lllyasviel/misc/resolve/main/sdxl_lcm_lora.safetensors',
        model_dir=paths_performance_loras[0],
        file_name=modules.flags.PerformanceLoRA.EXTREME_SPEED.value
    )


def downloading_sdxl_lightning_lora():
    return load_file_from_url(
        url='https://huggingface.co/mashb1t/misc/resolve/main/sdxl_lightning_4step_lora.safetensors',
        model_dir=paths_performance_loras[0],
        file_name=modules.flags.PerformanceLoRA.LIGHTNING.value
    )


def downloading_sdxl_hyper_sd_lora():
    return load_file_from_url(
        url='https://huggingface.co/mashb1t/misc/resolve/main/sdxl_hyper_sd_4step_lora.safetensors',
        model_dir=paths_performance_loras[0],
        file_name=modules.flags.PerformanceLoRA.HYPER_SD.value
    )
```
🤖 Prompt for AI Agents
In modules/config.py around lines 1075 to 1100, the downloading_sdxl_*_lora
functions currently return only the filename string instead of the full file
path, which breaks downstream code expecting the full path. To fix this, modify
each function to first ensure the directory paths_performance_loras[0] exists by
creating it if necessary, then call load_file_from_url and return its full path
result instead of just the filename. Apply this fix consistently to
downloading_sdxl_lcm_lora, downloading_sdxl_lightning_lora, and
downloading_sdxl_hyper_sd_lora.
Summary by CodeRabbit

- New Features
- Improvements
- Bug Fixes
- Documentation