
Conversation

@rattus128
Contributor

In the lowvram case, the patch math now runs in the model dtype, i.e. in the post-de-quantization domain; account for that in the VRAM estimate. The patching was also put back on the compute stream, which moves it off the peak, so relax the MATH_FACTOR to only x2 and drop the worst-case assumption that everything peaks at once.

RTX3060, flux2 fp8 with Lora:

Before:

[screenshot: VRAM usage before]

After:

[screenshot: VRAM usage after]

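The estimate being adjusted can be sketched as follows. This is a minimal standalone illustration of the formula visible in the traceback, `weight.numel() * model_dtype.itemsize * LOWVRAM_PATCH_ESTIMATE_MATH_FACTOR`; the helper name and standalone form are mine, not ComfyUI's actual code.

```python
# Rough sketch of the per-weight lowvram patch VRAM estimate: the math is
# done in the model dtype (post de-quantization), with a x2 headroom factor.
LOWVRAM_PATCH_ESTIMATE_MATH_FACTOR = 2  # relaxed from the old worst-case factor

def low_vram_patch_estimate(numel: int, model_dtype_itemsize: int) -> int:
    """Estimated scratch bytes to patch one weight in the model dtype."""
    return numel * model_dtype_itemsize * LOWVRAM_PATCH_ESTIMATE_MATH_FACTOR

# e.g. a 4096x4096 weight patched in bf16 (2 bytes per element):
print(low_vram_patch_estimate(4096 * 4096, 2))  # 67108864 bytes = 64 MiB
```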
@comfyanonymous comfyanonymous merged commit 60ee574 into comfyanonymous:master Dec 8, 2025
10 checks passed
@LukeG89

LukeG89 commented Dec 8, 2025

@rattus128 Something broke (at least with Qwen Image Edit). If I add a LoRA, it throws this error.

ComfyUI Error Report

Error Details

  • Node ID: 446
  • Node Type: KSampler
  • Exception Type: AttributeError
  • Exception Message: 'NoneType' object has no attribute 'itemsize'

Stack Trace

  File "E:\ComfyUI\ComfyUI\execution.py", line 515, in execute
    output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
                                                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\execution.py", line 329, in get_output_data
    return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\execution.py", line 303, in _async_map_node_over_list
    await process_inputs(input_dict, i)

  File "E:\ComfyUI\ComfyUI\execution.py", line 291, in process_inputs
    result = f(**inputs)
             ^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\nodes.py", line 1538, in sample
    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\nodes.py", line 1505, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\sample.py", line 60, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\samplers.py", line 1163, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\samplers.py", line 1053, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\samplers.py", line 1035, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed, latent_shapes=latent_shapes)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\samplers.py", line 984, in outer_sample
    self.inner_model, self.conds, self.loaded_models = comfy.sampler_helpers.prepare_sampling(self.model_patcher, noise.shape, self.conds, self.model_options)
                                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\sampler_helpers.py", line 130, in prepare_sampling
    return executor.execute(model, noise_shape, conds, model_options=model_options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\sampler_helpers.py", line 138, in _prepare_sampling
    comfy.model_management.load_models_gpu([model] + models, memory_required=memory_required + inference_memory, minimum_memory_required=minimum_memory_required + inference_memory)

  File "E:\ComfyUI\ComfyUI\comfy\model_management.py", line 701, in load_models_gpu
    loaded_model.model_load(lowvram_model_memory, force_patch_weights=force_patch_weights)

  File "E:\ComfyUI\ComfyUI\comfy\model_management.py", line 506, in model_load
    self.model_use_more_vram(use_more_vram, force_patch_weights=force_patch_weights)

  File "E:\ComfyUI\ComfyUI\comfy\model_management.py", line 536, in model_use_more_vram
    return self.model.partially_load(self.device, extra_memory, force_patch_weights=force_patch_weights)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 968, in partially_load
    raise e

  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 965, in partially_load
    self.load(device_to, lowvram_model_memory=current_used + extra_memory, force_patch_weights=force_patch_weights, full_load=full_load)

  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 681, in load
    loading = self._load_list()
              ^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 668, in _load_list
    module_offload_mem += low_vram_patch_estimate_vram(self.model, weight_key)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 142, in low_vram_patch_estimate_vram
    return weight.numel() * model_dtype.itemsize * LOWVRAM_PATCH_ESTIMATE_MATH_FACTOR
                            ^^^^^^^^^^^^^^^^^^^^

System Information

  • ComfyUI Version: 0.3.76
  • Arguments: ComfyUI\main.py --disable-api-nodes --fp32-vae --preview-size 2048
  • OS: win32
  • Python Version: 3.12.7 (tags/v3.12.7:0b05ead, Oct 1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
  • Embedded Python: true
  • PyTorch Version: 2.9.1+cu130

Devices

  • Name: cuda:0 NVIDIA GeForce RTX 3070 Ti Laptop GPU : cudaMallocAsync
    • Type: cuda
    • VRAM Total: 8589410304
    • VRAM Free: 7379746816
    • Torch VRAM Total: 67108864
    • Torch VRAM Free: 57540608

Logs

2025-12-08T23:02:00.983543 - [START] Security scan2025-12-08T23:02:00.983543 - 
2025-12-08T23:02:02.086135 - [DONE] Security scan2025-12-08T23:02:02.086135 - 
2025-12-08T23:02:02.180269 - ## ComfyUI-Manager: installing dependencies done.2025-12-08T23:02:02.180269 - 
2025-12-08T23:02:02.180269 - ** ComfyUI startup time:2025-12-08T23:02:02.181255 -  2025-12-08T23:02:02.181255 - 2025-12-08 23:02:02.1802025-12-08T23:02:02.181255 - 
2025-12-08T23:02:02.181255 - ** Platform:2025-12-08T23:02:02.181255 -  2025-12-08T23:02:02.181255 - Windows2025-12-08T23:02:02.181255 - 
2025-12-08T23:02:02.181255 - ** Python version:2025-12-08T23:02:02.181255 -  2025-12-08T23:02:02.181255 - 3.12.7 (tags/v3.12.7:0b05ead, Oct  1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]2025-12-08T23:02:02.181255 - 
2025-12-08T23:02:02.181255 - ** Python executable:2025-12-08T23:02:02.181255 -  2025-12-08T23:02:02.181255 - E:\ComfyUI\python_embeded\python.exe2025-12-08T23:02:02.181255 - 
2025-12-08T23:02:02.181255 - ** ComfyUI Path:2025-12-08T23:02:02.182252 -  2025-12-08T23:02:02.182252 - E:\ComfyUI\ComfyUI2025-12-08T23:02:02.182252 - 
2025-12-08T23:02:02.182252 - ** ComfyUI Base Folder Path:2025-12-08T23:02:02.182252 -  2025-12-08T23:02:02.182252 - E:\ComfyUI\ComfyUI2025-12-08T23:02:02.182252 - 
2025-12-08T23:02:02.182252 - ** User directory:2025-12-08T23:02:02.182252 -  2025-12-08T23:02:02.182252 - E:\ComfyUI\ComfyUI\user2025-12-08T23:02:02.182252 - 
2025-12-08T23:02:02.182252 - ** ComfyUI-Manager config path:2025-12-08T23:02:02.182252 -  2025-12-08T23:02:02.182252 - E:\ComfyUI\ComfyUI\user\__manager\config.ini2025-12-08T23:02:02.182252 - 
2025-12-08T23:02:02.194417 - ** Log path:2025-12-08T23:02:02.194417 -  2025-12-08T23:02:02.194417 - E:\ComfyUI\ComfyUI\user\comfyui.log2025-12-08T23:02:02.194417 - 
2025-12-08T23:02:03.284195 - 
Prestartup times for custom nodes:
2025-12-08T23:02:03.284195 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\rgthree-comfy
2025-12-08T23:02:03.284195 -    2.8 seconds: E:\ComfyUI\ComfyUI\custom_nodes\comfyui-manager
2025-12-08T23:02:03.284195 - 
2025-12-08T23:02:04.455220 - Checkpoint files will always be loaded safely.
2025-12-08T23:02:04.537813 - Total VRAM 8192 MB, total RAM 65277 MB
2025-12-08T23:02:04.537813 - pytorch version: 2.9.1+cu130
2025-12-08T23:02:04.537813 - Set vram state to: NORMAL_VRAM
2025-12-08T23:02:04.537813 - Device: cuda:0 NVIDIA GeForce RTX 3070 Ti Laptop GPU : cudaMallocAsync
2025-12-08T23:02:04.548177 - Using async weight offloading with 2 streams
2025-12-08T23:02:04.548177 - Enabled pinned memory 29374.0
2025-12-08T23:02:04.597553 - working around nvidia conv3d memory bug.
2025-12-08T23:02:05.410083 - Using pytorch attention
2025-12-08T23:02:06.752987 - Python version: 3.12.7 (tags/v3.12.7:0b05ead, Oct  1 2024, 03:06:41) [MSC v.1941 64 bit (AMD64)]
2025-12-08T23:02:06.752987 - ComfyUI version: 0.3.76
2025-12-08T23:02:06.790746 - ComfyUI frontend version: 1.33.10
2025-12-08T23:02:06.790746 - [Prompt Server] web root: E:\ComfyUI\python_embeded\Lib\site-packages\comfyui_frontend_package\static
2025-12-08T23:02:07.241955 - Total VRAM 8192 MB, total RAM 65277 MB
2025-12-08T23:02:07.241955 - pytorch version: 2.9.1+cu130
2025-12-08T23:02:07.241955 - Set vram state to: NORMAL_VRAM
2025-12-08T23:02:07.241955 - Device: cuda:0 NVIDIA GeForce RTX 3070 Ti Laptop GPU : cudaMallocAsync
2025-12-08T23:02:07.259551 - Using async weight offloading with 2 streams
2025-12-08T23:02:07.259551 - Enabled pinned memory 29374.0
2025-12-08T23:02:07.313568 - ### Loading Custom Nodes: IMGNR/Utils Pack (CatchEditTextNode, PreviewImageBase64Node)2025-12-08T23:02:07.313568 - 
2025-12-08T23:02:07.313568 - ### Loading: ComfyUI-Impact-Pack (V8.28)
2025-12-08T23:02:07.447080 - [Impact Pack] custom_wildcards path not found: E:\ComfyUI\custom_nodes\comfyui-impact-pack\custom_wildcards. Using default path.
2025-12-08T23:02:07.447080 - [Impact Pack] Wildcard total size (0.00 MB) is within cache limit (50.00 MB). Using full cache mode.
2025-12-08T23:02:07.447080 - [Impact Pack] Wildcards loading done.
2025-12-08T23:02:07.447080 - ### Loading: ComfyUI-Impact-Subpack (V1.3.5)
2025-12-08T23:02:07.447080 - [Impact Pack/Subpack] Using folder_paths to determine whitelist path: E:\ComfyUI\ComfyUI\user\default\ComfyUI-Impact-Subpack\model-whitelist.txt
2025-12-08T23:02:07.447080 - [Impact Pack/Subpack] Ensured whitelist directory exists: E:\ComfyUI\ComfyUI\user\default\ComfyUI-Impact-Subpack
2025-12-08T23:02:07.447080 - [Impact Pack/Subpack] Loaded 0 model(s) from whitelist: E:\ComfyUI\ComfyUI\user\default\ComfyUI-Impact-Subpack\model-whitelist.txt
2025-12-08T23:02:07.508109 - [Impact Subpack] ultralytics_bbox: E:\ComfyUI\ComfyUI\models\ultralytics\bbox
2025-12-08T23:02:07.508109 - [Impact Subpack] ultralytics_segm: E:\ComfyUI\ComfyUI\models\ultralytics\segm
2025-12-08T23:02:07.510536 - ### Loading: ComfyUI-Inspire-Pack (V1.23)
2025-12-08T23:02:07.569426 - ### Loading: ComfyUI-Manager (V3.38.1)
2025-12-08T23:02:07.569426 - [ComfyUI-Manager] network_mode: public
2025-12-08T23:02:07.746333 - ### ComfyUI Version: v0.3.76-50-g935493f6 | Released on '2025-12-08'
2025-12-08T23:02:08.014598 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/alter-list.json
2025-12-08T23:02:08.030386 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/model-list.json
2025-12-08T23:02:08.030386 - [WAS LMStudio Easy-Query] has loaded 8 nodes.2025-12-08T23:02:08.030386 - 
2025-12-08T23:02:08.030386 - Nodes: LM Studio Model, LM Studio Query, LM Studio Easy-Caption, LM Studio Chat, LM Studio Options, WAS Load Image Directory, LM Studio Easy-Caption Dataset, LM Studio Caption Dataset2025-12-08T23:02:08.030386 - 
2025-12-08T23:02:08.061895 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/github-stats.json
2025-12-08T23:02:08.109393 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/extension-node-map.json
2025-12-08T23:02:08.165843 - [ComfyUI-Manager] default cache updated: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json
2025-12-08T23:02:08.315948 - Using pytorch attention
2025-12-08T23:02:08.536728 - (RES4LYF) Init2025-12-08T23:02:08.536728 - 
2025-12-08T23:02:08.536728 - (RES4LYF) Importing beta samplers.2025-12-08T23:02:08.536728 - 
2025-12-08T23:02:08.558840 - (RES4LYF) Importing legacy samplers.2025-12-08T23:02:08.558840 - 
2025-12-08T23:02:08.568328 - 
2025-12-08T23:02:08.568328 - [rgthree-comfy] Loaded 48 extraordinary nodes. 🎉2025-12-08T23:02:08.568328 - 
2025-12-08T23:02:08.568328 - 
2025-12-08T23:02:08.568328 - [rgthree-comfy] ComfyUI's new Node 2.0 rendering may be incompatible with some rgthree-comfy nodes and features, breaking some rendering as well as losing the ability to access a node's properties (a vital part of many nodes). It also appears to run MUCH more slowly spiking CPU usage and causing jankiness and unresponsiveness, especially with large workflows. Personally I am not planning to use the new Nodes 2.0 and, unfortunately, am not able to invest the time to investigate and overhaul rgthree-comfy where needed. If you have issues when Nodes 2.0 is enabled, I'd urge you to switch it off as well and join me in hoping ComfyUI is not planning to deprecate the existing, stable canvas rendering all together.
2025-12-08T23:02:08.568328 - 
2025-12-08T23:02:09.552866 - WAS Node Suite: OpenCV Python FFMPEG support is enabled2025-12-08T23:02:09.552866 - 
2025-12-08T23:02:09.553955 - WAS Node Suite Warning: `ffmpeg_bin_path` is not set in `E:\ComfyUI\ComfyUI\custom_nodes\was-ns\was_suite_config.json` config file. Will attempt to use system ffmpeg binaries if available.2025-12-08T23:02:09.553955 - 
2025-12-08T23:02:09.964833 - WAS Node Suite: Finished. Loaded 220 nodes successfully.2025-12-08T23:02:09.964833 - 
2025-12-08T23:02:09.964833 - 
	"Believe you can and you're halfway there." - Theodore Roosevelt
2025-12-08T23:02:09.964833 - 
2025-12-08T23:02:09.964833 - 
Import times for custom nodes:
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\loadImageWithSubfolders.py
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\websocket_image_save.py
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\quick-connections
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-KiraLoraEQ
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\Comfyui-Resolution-Master
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-IMGNR-Utils
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_SigmoidOffsetScheduler
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-PainterLongVideo
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-PainterI2V
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\ComfyUI_LMStudio_EasyQuery
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\comfyui-custom-scripts
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-bleh
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\comfyui-kjnodes
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\rgthree-comfy
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\comfyui_essentials
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\comfyui-videohelpersuite
2025-12-08T23:02:09.964833 -    0.0 seconds: E:\ComfyUI\ComfyUI\custom_nodes\comfyui-inspire-pack
2025-12-08T23:02:09.964833 -    0.1 seconds: E:\ComfyUI\ComfyUI\custom_nodes\comfyui-impact-subpack
2025-12-08T23:02:09.964833 -    0.1 seconds: E:\ComfyUI\ComfyUI\custom_nodes\ComfyUI-Impact-Pack
2025-12-08T23:02:09.964833 -    0.4 seconds: E:\ComfyUI\ComfyUI\custom_nodes\comfyui-manager
2025-12-08T23:02:09.964833 -    0.5 seconds: E:\ComfyUI\ComfyUI\custom_nodes\res4lyf
2025-12-08T23:02:09.964833 -    1.4 seconds: E:\ComfyUI\ComfyUI\custom_nodes\was-ns
2025-12-08T23:02:09.964833 - 
2025-12-08T23:02:10.185633 - Context impl SQLiteImpl.
2025-12-08T23:02:10.187428 - Will assume non-transactional DDL.
2025-12-08T23:02:10.187428 - No target revision found.
2025-12-08T23:02:10.217041 - Starting server

2025-12-08T23:02:10.217041 - To see the GUI go to: http://127.0.0.1:8188
2025-12-08T23:02:11.684912 - FETCH ComfyRegistry Data: 5/1102025-12-08T23:02:11.685939 - 
2025-12-08T23:02:12.215583 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-12-08T23:02:12.215929 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /extensions/core/clipspace.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-12-08T23:02:12.218484 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /extensions/core/groupNode.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-12-08T23:02:12.219485 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /extensions/core/widgetInputs.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-12-08T23:02:12.710361 - [Inspire Pack] IPAdapterPlus is not installed.
2025-12-08T23:02:13.046756 - {"client": "<lmstudio.sync_api.Client object at 0x0000020603200230>", "event": "Websocket handling thread started", "thread_id": "Thread-13"}
2025-12-08T23:02:13.258828 - HTTP Request: GET http://127.0.0.1:41343/lmstudio-greeting "HTTP/1.1 200 OK"
2025-12-08T23:02:13.474485 - HTTP Request: GET ws://127.0.0.1:41343/system "HTTP/1.1 101 Switching Protocols"
2025-12-08T23:02:13.474485 - {"event": "Websocket session established", "ws_url": "ws://127.0.0.1:41343/system"}
2025-12-08T23:02:13.857329 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui/components/buttonGroup.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-12-08T23:02:13.864552 - [DEPRECATION WARNING] Detected import of deprecated legacy API: /scripts/ui/components/button.js. This is likely caused by a custom node extension using outdated APIs. Please update your extensions or contact the extension author for an updated version.
2025-12-08T23:02:15.257912 - FETCH ComfyRegistry Data: 10/1102025-12-08T23:02:15.257912 - 
2025-12-08T23:02:18.745789 - FETCH ComfyRegistry Data: 15/1102025-12-08T23:02:18.745789 - 
2025-12-08T23:02:22.359687 - FETCH ComfyRegistry Data: 20/1102025-12-08T23:02:22.360836 - 
2025-12-08T23:02:25.972236 - FETCH ComfyRegistry Data: 25/1102025-12-08T23:02:25.973235 - 
2025-12-08T23:02:29.622612 - FETCH ComfyRegistry Data: 30/1102025-12-08T23:02:29.622612 - 
2025-12-08T23:02:33.173591 - FETCH ComfyRegistry Data: 35/1102025-12-08T23:02:33.173591 - 
2025-12-08T23:02:36.731205 - FETCH ComfyRegistry Data: 40/1102025-12-08T23:02:36.731205 - 
2025-12-08T23:02:40.318098 - FETCH ComfyRegistry Data: 45/1102025-12-08T23:02:40.319057 - 
2025-12-08T23:02:43.798931 - FETCH ComfyRegistry Data: 50/1102025-12-08T23:02:43.799938 - 
2025-12-08T23:02:47.445804 - FETCH ComfyRegistry Data: 55/1102025-12-08T23:02:47.445804 - 
2025-12-08T23:02:51.580064 - FETCH ComfyRegistry Data: 60/1102025-12-08T23:02:51.580064 - 
2025-12-08T23:02:55.145580 - FETCH ComfyRegistry Data: 65/1102025-12-08T23:02:55.146664 - 
2025-12-08T23:02:58.723845 - FETCH ComfyRegistry Data: 70/1102025-12-08T23:02:58.723845 - 
2025-12-08T23:03:02.822195 - FETCH ComfyRegistry Data: 75/1102025-12-08T23:03:02.822195 - 
2025-12-08T23:03:02.929337 - got prompt
2025-12-08T23:03:03.042398 - Using pytorch attention in VAE
2025-12-08T23:03:03.043386 - Using pytorch attention in VAE
2025-12-08T23:03:03.341437 - VAE load device: cuda:0, offload device: cpu, dtype: torch.float32
2025-12-08T23:03:03.581268 - Requested to load WanVAE
2025-12-08T23:03:03.584152 - 0 models unloaded.
2025-12-08T23:03:03.615670 - loaded partially; 0.00 MB usable, 0.00 MB loaded, 484.00 MB offloaded, 45.57 MB buffer reserved, lowvram patches: 0
2025-12-08T23:03:06.463550 - FETCH ComfyRegistry Data: 80/1102025-12-08T23:03:06.463550 - 
2025-12-08T23:03:10.572204 - FETCH ComfyRegistry Data: 85/1102025-12-08T23:03:10.572204 - 
2025-12-08T23:03:14.140706 - FETCH ComfyRegistry Data: 90/1102025-12-08T23:03:14.143874 - 
2025-12-08T23:03:17.640839 - FETCH ComfyRegistry Data: 95/1102025-12-08T23:03:17.640839 - 
2025-12-08T23:03:20.830724 - CLIP/text encoder model load device: cuda:0, offload device: cpu, current: cpu, dtype: torch.float16
2025-12-08T23:03:21.267929 - FETCH ComfyRegistry Data: 100/1102025-12-08T23:03:21.267929 - 
2025-12-08T23:03:26.226237 - FETCH ComfyRegistry Data: 105/1102025-12-08T23:03:26.226237 - 
2025-12-08T23:03:28.258027 - model weight dtype torch.bfloat16, manual cast: None
2025-12-08T23:03:28.258027 - model_type FLUX
2025-12-08T23:03:29.808653 - FETCH ComfyRegistry Data: 110/1102025-12-08T23:03:29.808653 - 
2025-12-08T23:03:30.309120 - FETCH ComfyRegistry Data [DONE]2025-12-08T23:03:30.309120 - 
2025-12-08T23:03:30.436953 - [ComfyUI-Manager] default cache updated: https://api.comfy.org/nodes
2025-12-08T23:03:30.487901 - FETCH DATA from: https://raw.githubusercontent.com/ltdrdata/ComfyUI-Manager/main/custom-node-list.json2025-12-08T23:03:30.487901 - 2025-12-08T23:03:30.577290 -  [DONE]2025-12-08T23:03:30.577290 - 
2025-12-08T23:03:30.635273 - [ComfyUI-Manager] broken item:{'author': 'rjgoif', 'title': 'Img Label Tools', 'id': 'Img-Label-Tools', 'reference': 'https://github.com/rjgoif/ComfyUI-Img-Label-Tools', 'install_type': 'git-clone', 'description': 'Tools to help annotate images for sharing on Reddit, Discord, etc.'}
2025-12-08T23:03:30.671930 - [ComfyUI-Manager] All startup tasks have been completed.
2025-12-08T23:04:28.990690 - Requested to load QwenImageTEModel_
2025-12-08T23:04:58.666705 - loaded partially; 5637.80 MB usable, 5249.30 MB loaded, 9526.90 MB offloaded, 388.50 MB buffer reserved, lowvram patches: 0
2025-12-08T23:05:00.157792 - loaded partially; 5620.68 MB usable, 5231.94 MB loaded, 9544.61 MB offloaded, 388.50 MB buffer reserved, lowvram patches: 0
2025-12-08T23:05:01.009586 - Requested to load QwenImage
2025-12-08T23:05:03.338504 - !!! Exception during processing !!! 'NoneType' object has no attribute 'itemsize'
2025-12-08T23:05:03.758083 - Traceback (most recent call last):
  File "E:\ComfyUI\ComfyUI\execution.py", line 515, in execute
    output_data, output_ui, has_subgraph, has_pending_tasks = await get_output_data(prompt_id, unique_id, obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
                                                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\execution.py", line 329, in get_output_data
    return_values = await _async_map_node_over_list(prompt_id, unique_id, obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb, v3_data=v3_data)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\execution.py", line 303, in _async_map_node_over_list
    await process_inputs(input_dict, i)
  File "E:\ComfyUI\ComfyUI\execution.py", line 291, in process_inputs
    result = f(**inputs)
             ^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\nodes.py", line 1538, in sample
    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\nodes.py", line 1505, in common_ksampler
    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\sample.py", line 60, in sample
    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\samplers.py", line 1163, in sample
    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\samplers.py", line 1053, in sample
    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\samplers.py", line 1035, in sample
    output = executor.execute(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed, latent_shapes=latent_shapes)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\samplers.py", line 984, in outer_sample
    self.inner_model, self.conds, self.loaded_models = comfy.sampler_helpers.prepare_sampling(self.model_patcher, noise.shape, self.conds, self.model_options)
                                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\sampler_helpers.py", line 130, in prepare_sampling
    return executor.execute(model, noise_shape, conds, model_options=model_options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\patcher_extension.py", line 112, in execute
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\sampler_helpers.py", line 138, in _prepare_sampling
    comfy.model_management.load_models_gpu([model] + models, memory_required=memory_required + inference_memory, minimum_memory_required=minimum_memory_required + inference_memory)
  File "E:\ComfyUI\ComfyUI\comfy\model_management.py", line 701, in load_models_gpu
    loaded_model.model_load(lowvram_model_memory, force_patch_weights=force_patch_weights)
  File "E:\ComfyUI\ComfyUI\comfy\model_management.py", line 506, in model_load
    self.model_use_more_vram(use_more_vram, force_patch_weights=force_patch_weights)
  File "E:\ComfyUI\ComfyUI\comfy\model_management.py", line 536, in model_use_more_vram
    return self.model.partially_load(self.device, extra_memory, force_patch_weights=force_patch_weights)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 968, in partially_load
    raise e
  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 965, in partially_load
    self.load(device_to, lowvram_model_memory=current_used + extra_memory, force_patch_weights=force_patch_weights, full_load=full_load)
  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 681, in load
    loading = self._load_list()
              ^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 668, in _load_list
    module_offload_mem += low_vram_patch_estimate_vram(self.model, weight_key)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 142, in low_vram_patch_estimate_vram
    return weight.numel() * model_dtype.itemsize * LOWVRAM_PATCH_ESTIMATE_MATH_FACTOR
                            ^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'itemsize'

2025-12-08T23:05:03.778520 - Prompt executed in 120.83 seconds
2025-12-08T23:06:04.653718 - got prompt
2025-12-08T23:06:04.717234 - Requested to load QwenImage
2025-12-08T23:06:04.747372 - !!! Exception during processing !!! 'NoneType' object has no attribute 'itemsize'

2025-12-08T23:06:04.751633 - Prompt executed in 0.06 seconds
2025-12-08T23:06:18.941502 - got prompt
2025-12-08T23:06:19.022784 - Requested to load QwenImage
2025-12-08T23:06:19.062557 - !!! Exception during processing !!! 'NoneType' object has no attribute 'itemsize'

2025-12-08T23:06:19.067925 - Prompt executed in 0.11 seconds
2025-12-08T23:06:59.209040 - got prompt
2025-12-08T23:06:59.269801 - Requested to load QwenImage
2025-12-08T23:06:59.304836 - !!! Exception during processing !!! 'NoneType' object has no attribute 'itemsize'

2025-12-08T23:06:59.308935 - Prompt executed in 0.08 seconds
2025-12-08T23:07:12.400969 - got prompt
2025-12-08T23:07:12.449884 - Requested to load QwenImage
2025-12-08T23:07:12.496614 - !!! Exception during processing !!! 'NoneType' object has no attribute 'itemsize'

2025-12-08T23:07:12.502314 - Prompt executed in 0.08 seconds
2025-12-08T23:07:21.650620 - got prompt
2025-12-08T23:07:21.727728 - Requested to load QwenImageTEModel_
2025-12-08T23:07:24.025070 - loaded partially; 5618.68 MB usable, 5228.44 MB loaded, 9548.12 MB offloaded, 388.50 MB buffer reserved, lowvram patches: 0
2025-12-08T23:07:24.843973 - loaded partially; 5618.68 MB usable, 5228.44 MB loaded, 9548.12 MB offloaded, 388.50 MB buffer reserved, lowvram patches: 0
2025-12-08T23:07:25.631361 - Requested to load QwenImage
2025-12-08T23:07:30.238466 - Interrupting prompt 4bad7087-bef1-46e3-aa6c-872313d51369
2025-12-08T23:08:15.820679 - loaded partially; 4728.88 MB usable, 4395.76 MB loaded, 34572.14 MB offloaded, 324.11 MB buffer reserved, lowvram patches: 0
2025-12-08T23:08:15.944974 - 
  0%|                                                                                                                               | 0/30 [00:00<?, ?it/s]2025-12-08T23:08:15.980962 - 
2025-12-08T23:08:15.980962 - Processing interrupted
2025-12-08T23:08:15.982972 - Prompt executed in 54.31 seconds
2025-12-08T23:08:25.238343 - got prompt
2025-12-08T23:08:25.357945 - loaded partially; 4728.88 MB usable, 4395.76 MB loaded, 34572.14 MB offloaded, 324.11 MB buffer reserved, lowvram patches: 0
2025-12-08T23:08:25.370993 - (RES4LYF) rk_type: res_2s2025-12-08T23:08:25.370993 - 
2025-12-08T23:08:25.469069 - 
  0%|                                                                                                                               | 0/30 [00:00<?, ?it/s]2025-12-08T23:08:28.393552 - Interrupting prompt e2b75e8c-1ef1-4c3e-9d3b-e844c7027dec
2025-12-08T23:08:28.971411 - Processing interrupted
2025-12-08T23:08:28.972411 - Prompt executed in 3.71 seconds
2025-12-08T23:08:29.189947 - 
  0%|                                                                                                                               | 0/30 [00:03<?, ?it/s]2025-12-08T23:08:29.189947 - 
2025-12-08T23:08:35.624311 - got prompt
2025-12-08T23:08:36.865175 - Requested to load QwenImageTEModel_
2025-12-08T23:08:40.369290 - loaded partially; 5612.68 MB usable, 5222.56 MB loaded, 9554.00 MB offloaded, 388.50 MB buffer reserved, lowvram patches: 0
2025-12-08T23:08:41.186293 - loaded partially; 5612.68 MB usable, 5222.56 MB loaded, 9554.00 MB offloaded, 388.50 MB buffer reserved, lowvram patches: 0
2025-12-08T23:08:41.980778 - Requested to load QwenImage
2025-12-08T23:08:43.482793 - !!! Exception during processing !!! 'NoneType' object has no attribute 'itemsize'
    return self.original(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\sampler_helpers.py", line 138, in _prepare_sampling
    comfy.model_management.load_models_gpu([model] + models, memory_required=memory_required + inference_memory, minimum_memory_required=minimum_memory_required + inference_memory)
  File "E:\ComfyUI\ComfyUI\comfy\model_management.py", line 701, in load_models_gpu
    loaded_model.model_load(lowvram_model_memory, force_patch_weights=force_patch_weights)
  File "E:\ComfyUI\ComfyUI\comfy\model_management.py", line 506, in model_load
    self.model_use_more_vram(use_more_vram, force_patch_weights=force_patch_weights)
  File "E:\ComfyUI\ComfyUI\comfy\model_management.py", line 536, in model_use_more_vram
    return self.model.partially_load(self.device, extra_memory, force_patch_weights=force_patch_weights)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 968, in partially_load
    raise e
  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 965, in partially_load
    self.load(device_to, lowvram_model_memory=current_used + extra_memory, force_patch_weights=force_patch_weights, full_load=full_load)
  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 681, in load
    loading = self._load_list()
              ^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 668, in _load_list
    module_offload_mem += low_vram_patch_estimate_vram(self.model, weight_key)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "E:\ComfyUI\ComfyUI\comfy\model_patcher.py", line 142, in low_vram_patch_estimate_vram
    return weight.numel() * model_dtype.itemsize * LOWVRAM_PATCH_ESTIMATE_MATH_FACTOR
                            ^^^^^^^^^^^^^^^^^^^^
AttributeError: 'NoneType' object has no attribute 'itemsize'

2025-12-08T23:08:43.488460 - Prompt executed in 7.85 seconds

```python
if weight is None:
    return 0
return weight.numel() * torch.float32.itemsize * LOWVRAM_PATCH_ESTIMATE_MATH_FACTOR
```

```python
model_dtype = getattr(model, "manual_cast_dtype", torch.float32)
```

`manual_cast_dtype` can be `None`, so this code needs to either exclude that case or append `or torch.float32`. `getattr`'s default only fires if the attribute doesn't exist, not if it's `None`, so this PR is erroring for users.
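The `getattr` pitfall described above can be shown in a few lines (a minimal sketch with no torch dependency; `FALLBACK` stands in for `torch.float32`):

```python
FALLBACK = "float32"  # stand-in for torch.float32

class Model:
    manual_cast_dtype = None  # attribute exists, but its value is None

model = Model()

# getattr's default only applies when the attribute is MISSING,
# not when it is None, so this still returns None:
dtype = getattr(model, "manual_cast_dtype", FALLBACK)
assert dtype is None  # -> the later `dtype.itemsize` then raises AttributeError

# `or FALLBACK` covers both the missing-attribute and the None cases:
dtype = getattr(model, "manual_cast_dtype", None) or FALLBACK
assert dtype == "float32"
```

This matches the traceback above: the `None` sails through `getattr` and only fails later at `model_dtype.itemsize`.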
