Salim Fares authored commit e39ec54e

Utilising Foundation Models as Active Annotators for Semantic Segmentation

conda create -n myenv python=3.9.18
conda activate myenv
git clone https://github.com/SalimFares4/Active-Learning-Segmentation.git
cd Active-Learning-Segmentation
pip install -r requirements.txt
python -m ipykernel install --user --name myenv --display-name "Python (myenv)"

To reproduce results similar to those documented, follow these steps (the scripts create all required directories and files, so there is no need to create anything manually):

  1. Use download_data.ipynb to download the required datasets.

  2. Use 01-Data-preparation.ipynb to pre-process the downloaded data. Uncomment the section related to the desired dataset. The script processes the raw data and copies it to a separate directory called processed.

  3. Use 02-Active_Learning_Pipeline to train with active learning. The first cell of the notebook reads the parameters file for the desired dataset: uncomment the one you want and comment out the rest. At the end of the notebook, predefined settings are provided; each set of settings corresponds to an approach proposed in the thesis. Modify the parameters file according to the predefined settings to apply the desired approach. (Make sure that "knowledge_distillation" in the parameters file is set to false.)

At the end of the training cell, all generated masks and saved model states are deleted.
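The "knowledge_distillation" switch mentioned above can be toggled programmatically instead of by hand. The sketch below assumes a JSON-style parameters file; only the "knowledge_distillation" key is confirmed by this README, and the other keys and the file name are illustrative placeholders.

```python
import json

# Hypothetical parameters file content; only "knowledge_distillation"
# is a confirmed key, the rest are placeholders for illustration.
params = {
    "dataset": "processed",
    "knowledge_distillation": True,
}

# Disable knowledge distillation before running the
# active-learning pipeline (step 3 above).
params["knowledge_distillation"] = False

# Write the edited parameters back to disk.
with open("params.json", "w") as f:
    json.dump(params, f, indent=2)
```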

  1. Use 03-Knowledge_Distillation_Preparation to generate and save the logits from SAM, along with a DataFrame holding their paths.

  2. Use 04-Knowledge_Distillation_Training to train with knowledge distillation using the saved SAM logits. (Make sure that "knowledge_distillation" in the parameters file is set to true.)

No model states are saved.
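The preparation step above caches teacher logits to disk and records their locations in a DataFrame. A minimal sketch of that idea, where sam_logits_for() is a hypothetical stand-in for the actual SAM inference call and the shapes and file names are placeholders:

```python
import numpy as np
import pandas as pd

def sam_logits_for(image_id):
    # Hypothetical stand-in for SAM inference; a real run would
    # load the image and query the SAM model here.
    return np.random.rand(1, 256, 256).astype(np.float32)

records = []
for image_id in ["img_000", "img_001"]:
    logits = sam_logits_for(image_id)
    path = f"{image_id}_logits.npy"
    np.save(path, logits)  # cache teacher logits to disk
    records.append({"image": image_id, "logits_path": path})

# DataFrame holding the paths, as described in the steps above;
# the training notebook can then load each cached logit by path.
df = pd.DataFrame(records)
df.to_csv("logits_index.csv", index=False)
```

During distillation training, the student's predictions would be compared against the logits loaded from these paths rather than re-running SAM on every epoch.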

The other files are used for small script experiments or visualization; feel free to explore and try them.