GitHub Repository: ninjaneural/webui
Path: blob/master/deforum/dark_sushi_25d_25d_webui_colab.ipynb
Kernel: Python 3
#@markdown ## Google Drive integration
#@markdown **Check this to automatically save generated images to Google Drive**
Google_Drive = False #@param {type:"boolean"}

#@markdown **Check this to save the initial model (checkpoint) to Google Drive**
#@markdown <div><font color="red">The model file is large (2GB ~ 7GB), so check your Google Drive capacity</div>
#@markdown <div><font color="red">In return, after the first download, later runs start faster</div>
Checkpoint_Google_Save = False #@param {type:"boolean"}

#@markdown -----
#@markdown *Tunneling*
#@markdown **Ngrok**
Ngrok_Key = '' #@param {type:"string"}
#@markdown **Localtunnel**
Localtunnel = False #@param {type:"boolean"}

#@markdown -----
#@markdown *The settings below do not need to be changed*
#@markdown **Folder linked in Google Drive**
Google_Drive_Dir = 'webui' #@param {type:"string"}
#@markdown **Folder notes**: you can create it in advance, or it will be created automatically if missing
#@markdown * webui/output : generated images are saved here
#@markdown * webui/checkpoint : models (checkpoints) placed here are loaded
#@markdown * webui/lora : LoRA files placed here are loaded
#@markdown * webui/embedding : embeddings (Textual Inversion) placed here are loaded
#@markdown * webui/hyperwork : hypernetworks placed here are loaded
#@markdown * webui/wildcards : wildcards placed here are loaded

#@markdown **Initial model (checkpoint) URL**
Checkpoint_Url = 'https://civitai.com/api/download/models/141866?type=Model&format=SafeTensor&size=pruned&fp=fp16' #@param {type:"string"}
#@markdown **Initial model filename**
Checkpoint_Filename = 'dark_sushi_25d_25d.safetensors' #@param {type:"string"}

#@markdown -----
#@markdown *Additional extensions*
#@markdown **Uncheck this if you do not use ControlNet**
ControlNet = True #@param {type:"boolean"}

Workspace = 'ui'
NotebookVersion = 'deforum'

# Mount Google Drive if requested
if Google_Drive:
    from google.colab import drive
    drive.mount('/content/drive')

# Save the checkpoint under a Drive-linked subfolder when both options are enabled
Checkpoint_SavePath = f'/content/{Workspace}/models/Stable-diffusion'
if Google_Drive and Checkpoint_Google_Save:
    Checkpoint_SavePath = f'/content/{Workspace}/models/Stable-diffusion/google'

# Install the aria2 downloader and the WebUI itself
!apt -y install -qq aria2
%cd /content
!wget https://raw.githubusercontent.com/neuralninja22/colab/master/misc/install_{NotebookVersion}.sh -O /content/install.sh
!bash /content/install.sh {Workspace} {ControlNet}

# Link Google Drive
!wget https://raw.githubusercontent.com/neuralninja22/colab/master/misc/link_google_drive.sh -O /content/link_google_drive.sh
!bash /content/link_google_drive.sh {Workspace} {Google_Drive} {Google_Drive_Dir}

# Customization (optional user install script and saved configs from Drive)
%cd /content/{Workspace}
!bash /content/drive/MyDrive/{Google_Drive_Dir}/{NotebookVersion}/install.sh {Workspace}
if Google_Drive:
    !cp -f /content/drive/MyDrive/{Google_Drive_Dir}/{NotebookVersion}/config.json /content/{Workspace}/config.json
    !cp -f /content/drive/MyDrive/{Google_Drive_Dir}/{NotebookVersion}/ui-config.json /content/{Workspace}/ui-config.json
    !cp -f /content/drive/MyDrive/{Google_Drive_Dir}/{NotebookVersion}/styles.csv /content/{Workspace}/styles.csv

# Select the initial checkpoint in config.json
!sed -i -e 's/"sd_model_checkpoint": "",/"sd_model_checkpoint": "{Checkpoint_Filename}",/g' /content/{Workspace}/config.json

# Checkpoint
!aria2c --console-log-level=error -c -x 16 -s 16 -k 1M "{Checkpoint_Url}" -d {Checkpoint_SavePath} -o {Checkpoint_Filename}

# VAE
!aria2c --console-log-level=error -c -x 16 -s 16 -k 1M https://huggingface.co/ckpt/sd-vae-ft-mse-original/resolve/main/vae-ft-mse-840000-ema-pruned.ckpt -d {Checkpoint_SavePath} -o vae-ft-mse-840000-ema-pruned.vae.pt
!aria2c --console-log-level=error -c -x 16 -s 16 -k 1M https://huggingface.co/hakurei/waifu-diffusion-v1-4/resolve/main/vae/kl-f8-anime2.ckpt -d /content/{Workspace}/models/VAE -o kl-f8-anime2.ckpt

# LoRA
!aria2c --console-log-level=error -c -x 16 -s 16 -k 1M "https://civitai.com/api/download/models/62833?type=Model&format=SafeTensor" -d /content/{Workspace}/models/Lora -o add_detail.safetensors
!aria2c --console-log-level=error -c -x 16 -s 16 -k 1M "https://civitai.com/api/download/models/63006?type=Model&format=SafeTensor" -d /content/{Workspace}/models/Lora -o LowRA.safetensors
!aria2c --console-log-level=error -c -x 16 -s 16 -k 1M "https://civitai.com/api/download/models/32988?type=Model&format=SafeTensor&size=full&fp=fp16" -d /content/{Workspace}/models/Lora -o blindbox_V1Mix.safetensors

# Embeddings
!aria2c --console-log-level=error -c -x 16 -s 16 -k 1M https://huggingface.co/datasets/gsdf/EasyNegative/resolve/main/EasyNegative.pt -d /content/{Workspace}/embeddings -o EasyNegative.pt
!aria2c --console-log-level=error -c -x 16 -s 16 -k 1M https://huggingface.co/AsciiP/badhandv4/resolve/main/badhandv4.pt -d /content/{Workspace}/embeddings -o badhandv4.pt
!aria2c --console-log-level=error -c -x 16 -s 16 -k 1M https://huggingface.co/yesyeahvh/bad-hands-5/resolve/main/bad-hands-5.pt -d /content/{Workspace}/embeddings -o bad-hands-5.pt
!aria2c --console-log-level=error -c -x 16 -s 16 -k 1M "https://civitai.com/api/download/models/60938?type=Negative&format=Other" -d /content/{Workspace}/embeddings/ -o negative_hand.pt
!aria2c --console-log-level=error -c -x 16 -s 16 -k 1M "https://civitai.com/api/download/models/60095?type=Negative&format=Other" -d /content/{Workspace}/embeddings/ -o bad_prompt_version2.pt

# Optional Localtunnel: wait until the WebUI port is open, then expose it
if Localtunnel:
    !npm install -g localtunnel

    import subprocess
    import threading
    import time
    import socket
    import urllib.request

    def iframe_thread(port):
        # Poll the local port until the WebUI is listening
        while True:
            time.sleep(0.5)
            sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            result = sock.connect_ex(('127.0.0.1', port))
            sock.close()
            if result == 0:
                break
        print("\nWebUI finished loading, trying to launch localtunnel (if it gets stuck here localtunnel is having issues)\n")
        print("The password/endpoint ip for localtunnel is:", urllib.request.urlopen('https://ipv4.icanhazip.com').read().decode('utf8').strip("\n"))
        p = subprocess.Popen(["lt", "--port", "{}".format(port)], stdout=subprocess.PIPE)
        for line in p.stdout:
            print(line.decode(), end='')

    threading.Thread(target=iframe_thread, daemon=True, args=(7860,)).start()

# Launch the WebUI (ngrok tunnel if a key was provided, otherwise a Gradio share link)
if Ngrok_Key:
    !python launch.py --xformers --no-half-vae --theme dark --ngrok {Ngrok_Key}
else:
    !python launch.py --share --xformers --no-half-vae --theme dark --gradio-queue
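For reference, the Google Drive folder layout described in the form above can also be created ahead of time. This is only a minimal sketch, not part of the notebook: it assumes Drive is already mounted at /content/drive and that Google_Drive_Dir is left at its default value of 'webui'.

import os

# Hypothetical convenience snippet: pre-create the Drive subfolders this notebook reads from.
# The notebook already creates missing folders automatically, so this is optional and only
# useful if you want to drop checkpoints, LoRAs, embeddings, or wildcards in place beforehand.
drive_root = '/content/drive/MyDrive/webui'  # matches the default Google_Drive_Dir above
for sub in ('output', 'checkpoint', 'lora', 'embedding', 'hyperwork', 'wildcards'):
    os.makedirs(os.path.join(drive_root, sub), exist_ok=True)

Files placed in these folders (for example a .safetensors checkpoint in webui/checkpoint or LoRA files in webui/lora) are picked up by the WebUI on the next run, as described in the folder notes above.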