Installation
Prerequisites
Ensure conda is installed on your system. You can install Miniconda or Anaconda:
- Miniconda (recommended): https://docs.conda.io/en/latest/miniconda.html
- Anaconda: https://www.anaconda.com/products/distribution
After installing conda, ensure it is available in your PATH (you may need to restart your terminal session first). A quick check is shown below:
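If conda is on your PATH, the following command prints its version; a "command not found" error means the PATH has not been updated yet:
conda --version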
Before installing the unlearn_diff package, follow these steps to set up your environment correctly. These instructions ensure compatibility with the required dependencies, including Python, PyTorch, and ONNX Runtime.
Step-by-Step Setup:
Step 1: Create a Conda Environment
Create a new Conda environment named myenv with Python 3.8.5:
conda create -n myenv python=3.8.5
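If needed, you can confirm the environment was created by listing all Conda environments; myenv should appear in the output:
conda env list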
Step 2: Activate the Environment
Activate the environment to work within it:
conda activate myenv
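With the environment active, the Python interpreter on your PATH should report the pinned version (the expected output is Python 3.8.5):
python --version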
Step 3: Install Core Dependencies
Install PyTorch, torchvision, CUDA Toolkit, and ONNX Runtime with specific versions:
conda install pytorch==1.11.0 torchvision==0.12.0 cudatoolkit=11.3 onnxruntime==1.16.3 -c pytorch -c conda-forge
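To confirm the core dependencies resolved correctly, you can print their versions from within the environment (torch.cuda.is_available() simply reports whether a CUDA-capable GPU is visible; it may print False on CPU-only machines):
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
python -c "import torchvision; print(torchvision.__version__)"
python -c "import onnxruntime; print(onnxruntime.__version__)"
The expected versions are 1.11.0, 0.12.0, and 1.16.3 respectively.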
Step 4: Install the unlearn_diff Package
Install our unlearn_diff package using pip:
pip install unlearn_diff
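You can verify that the package was installed by asking pip for its metadata; this prints the installed version and location if the install succeeded:
pip show unlearn_diff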
Step 5: Install Additional Git Dependencies:
After installing unlearn_diff, install the following Git-based dependencies in the same Conda environment to ensure full functionality:
pip install git+https://github.com/CompVis/taming-transformers.git@master git+https://github.com/openai/CLIP.git@main git+https://github.com/crowsonkb/k-diffusion.git git+https://github.com/cocodataset/panopticapi.git git+https://github.com/Phoveran/fastargs.git@main git+https://github.com/boomb0om/text2image-benchmark
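As a sanity check, you can try importing a couple of the Git-installed packages. The module names below (clip and k_diffusion) are assumptions based on the upstream repositories, not names confirmed by this guide:
python -c "import clip, k_diffusion; print('git dependencies importable')"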
Downloading Data and Models
After you install the package, you can use the following commands to download datasets and models.
- Dataset:
  - unlearn_canvas:
    - Sample:
      download_data sample unlearn_canvas
    - Full:
      download_data full unlearn_canvas
  - i2p:
    - Sample:
      download_data sample i2p
    - Full:
      download_data full i2p
- Model:
  - compvis:
    download_model compvis
  - diffuser:
    download_model diffuser
- Download best.onnx model:
  download_best_onnx
- Download coco dataset:
  download_coco_dataset
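For example, a typical first-time setup that fetches the unlearn_canvas sample data, the compvis model, the best.onnx model, and the coco dataset might run the documented commands in sequence:
download_data sample unlearn_canvas
download_model compvis
download_best_onnx
download_coco_dataset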