
NAS-SegNet

Main Code (note: this is not the main code branch): see the Main Project Code Branch.

Description

The goal of this project is to implement NASSegNet for nuclei image segmentation with analog/hardware-aware neural architecture search (HW-NAS). Until now, only classification architectures have been implemented in this domain, so the unique value of this solution is the new segmentation implementation.


Code Updates: NAS Run | Digital Training | Analog Training

Approach

  • Use the MONAI (Medical Open Network for AI) framework for dataset pre-processing and augmentation.
  • Adapt the existing IBM Analog-NAS macro-architecture, which targets image classification by default, into a new macro-architecture used to run a neural architecture search that generates an optimized NASSegNet model for nuclei segmentation.
  • The Analog-NAS approach explores different neural network configurations, evaluating their performance on the target task and hardware constraints to find the most efficient architecture. This evaluation is driven by pretrained surrogate models (see the sketch after this list).
  • We then train the best-accuracy architecture using both digital and analog methods.
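
The surrogate-driven search can be pictured as an evolutionary loop in which candidate architectures are scored by a pretrained surrogate instead of being trained. The sketch below only illustrates that idea; the helper functions and the `surrogate.predict_accuracy` interface are hypothetical names, not the actual analog-nas API.

```python
import random

# Illustrative only: helper names and the surrogate interface are assumptions,
# not the actual analog-nas API.
def sample_architecture(search_space):
    """Draw a random architecture configuration from the search space."""
    return {name: random.choice(choices) for name, choices in search_space.items()}

def mutate(config, search_space):
    """Perturb one architectural hyperparameter of an existing configuration."""
    child = dict(config)
    name = random.choice(list(search_space))
    child[name] = random.choice(search_space[name])
    return child

def evolutionary_search(search_space, surrogate, population=20, generations=50):
    """Keep the half of the population the surrogate scores highest,
    then refill it with mutated children; no candidate is ever fully trained."""
    candidates = [sample_architecture(search_space) for _ in range(population)]
    for _ in range(generations):
        ranked = sorted(candidates, key=surrogate.predict_accuracy, reverse=True)
        parents = ranked[: population // 2]
        candidates = parents + [mutate(p, search_space) for p in parents]
    return max(candidates, key=surrogate.predict_accuracy)
```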

Results

  • Successfully implemented the NASSegNet architecture for nuclei segmentation.
  • Leveraged the IBM Analog-NAS tool to perform a neural architecture search, resulting in an optimized NASSegNet model with the best accuracy for this task.
  • Trained the best generated model both digitally and in analog using IBM AIHWKIT (a sketch follows this list).
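
For the analog side, AIHWKIT can map a trained PyTorch model onto simulated analog tiles with `convert_to_analog`. The snippet below is a minimal sketch: the small `Sequential` model is a stand-in for the generated NASSegNet, and the RPU configuration is a generic default rather than the one used in this project's notebooks.

```python
import torch
from aihwkit.nn.conversion import convert_to_analog
from aihwkit.simulator.configs import SingleRPUConfig
from aihwkit.simulator.configs.devices import ConstantStepDevice

# Stand-in for the best NASSegNet architecture found by the search.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, kernel_size=3, padding=1),
    torch.nn.ReLU(),
    torch.nn.Conv2d(8, 2, kernel_size=1),
)

# Replace every Conv/Linear layer with an analog equivalent backed by
# simulated resistive devices.
rpu_config = SingleRPUConfig(device=ConstantStepDevice())
analog_model = convert_to_analog(model, rpu_config)
```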

Technical Challenges

The main technical challenges were:

  • Utilizing Intel's BootstrapNAS to create a search space and macro-architecture for segmentation based on the UNet architecture: BootstrapNAS is tightly coupled with Intel's NNCF library and does not support integration with AnalogNAS.
  • AnalogNAS and its image-classification dependency: AnalogNAS's search space and macro-architecture are designed for image classification, not segmentation.
  • Implementing NASSegNet for nuclei segmentation poses significant technical challenges.

Dataset and Data Preparation

Nuclear segmentation in digital microscopic tissue images can enable extraction of high-quality features for nuclear morphometric and other analyses in computational pathology. However, conventional image processing techniques such as Otsu thresholding and watershed segmentation do not work effectively on challenging cases such as chromatin-sparse and crowded nuclei. In contrast, machine learning-based segmentation techniques are able to generalize over nuclear appearances. Data augmentation and preprocessing transforms are applied using train_transforms and val_transforms, and the data is supplied to the model via dataloaders.
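
The exact preprocessing lives in the project's notebooks; a minimal MONAI pipeline of the kind described might look like the following, where `train_files` and `val_files` are assumed to be lists of {"image": path, "label": path} dictionaries.

```python
from monai.transforms import (
    Compose, LoadImaged, EnsureChannelFirstd, ScaleIntensityd,
    RandRotate90d, RandFlipd,
)
from monai.data import Dataset, DataLoader

# Training transforms: load, normalize, and augment; validation omits augmentation.
train_transforms = Compose([
    LoadImaged(keys=["image", "label"]),
    EnsureChannelFirstd(keys=["image", "label"]),
    ScaleIntensityd(keys="image"),
    RandRotate90d(keys=["image", "label"], prob=0.5),
    RandFlipd(keys=["image", "label"], prob=0.5),
])
val_transforms = Compose([
    LoadImaged(keys=["image", "label"]),
    EnsureChannelFirstd(keys=["image", "label"]),
    ScaleIntensityd(keys="image"),
])

# `train_files` / `val_files`: lists of {"image": path, "label": path} dicts.
train_ds = Dataset(data=train_files, transform=train_transforms)
val_ds = Dataset(data=val_files, transform=val_transforms)
train_loader = DataLoader(train_ds, batch_size=4, shuffle=True)
val_loader = DataLoader(val_ds, batch_size=1)
```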

Before/after preprocessing examples (images in the original README).

Training the NASSegNet Model with Analog AI NAS

We define the NASSegNet model architecture, specifying parameters such as the input/output channels, the number of units, and the downsample and upsample/transpose convolution layers.
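
NASSegNet itself is defined in this repository, so its constructor is project-specific; as an illustration of the kinds of parameters involved (channels, residual units, down/upsampling strides), MONAI's generic UNet can be configured like this.

```python
from monai.networks.nets import UNet

# Illustrative stand-in: NASSegNet's constructor differs, but it exposes the
# same kinds of choices the search explores (channels, units, strides).
model = UNet(
    spatial_dims=2,                    # 2D nuclei images
    in_channels=3,                     # RGB input
    out_channels=2,                    # background / nucleus
    channels=(16, 32, 64, 128, 256),   # feature maps per level
    strides=(2, 2, 2, 2),              # downsampling, mirrored by transpose convs
    num_res_units=2,                   # residual units per level
)
```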

We set up the Dice loss as the loss function and the Dice metric as the evaluation metric for training and validating the model. ![image](https://github.com/vishgoki/nas-seg-net/assets/57005975/2d4e10be-7de0-46d3-be6f-282482ca8fb4)
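
A minimal sketch of how MONAI's DiceLoss and DiceMetric are typically wired up; the two-class assumption and post-processing transforms are illustrative, and `model` / `val_loader` refer to the objects sketched above.

```python
import torch
from monai.losses import DiceLoss
from monai.metrics import DiceMetric
from monai.transforms import AsDiscrete

# Dice loss for training, Dice metric for validation.
loss_fn = DiceLoss(to_onehot_y=True, softmax=True)
dice_metric = DiceMetric(include_background=False, reduction="mean")
post_pred = AsDiscrete(argmax=True, to_onehot=2)
post_label = AsDiscrete(to_onehot=2)

# Validation pass over the held-out set.
model.eval()
with torch.no_grad():
    for batch in val_loader:
        outputs = model(batch["image"])
        preds = [post_pred(out) for out in outputs]
        labels = [post_label(lab) for lab in batch["label"]]
        dice_metric(y_pred=preds, y=labels)
print("mean Dice:", dice_metric.aggregate().item())
```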

Observations and Conclusion

  • Successfully implemented the NASSegNet architecture for nuclei segmentation. The best-accuracy architecture generated by the search was trained digitally, and analog training was also performed with AIHWKIT.
  • Leveraged the IBM Analog-NAS tool to perform the neural architecture search and used AIHWKIT for analog training (sketched below).
  • This solution represents an implementation of NASSegNet for medical image segmentation, going beyond the previous use of ResNet-like architectures in this domain.
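
For the analog training step, AIHWKIT provides AnalogSGD, which routes weight updates through the simulated analog tiles. A minimal sketch, assuming the `analog_model`, `train_loader`, and `loss_fn` from the earlier snippets:

```python
from aihwkit.optim import AnalogSGD

# Analog-aware SGD for the converted model.
optimizer = AnalogSGD(analog_model.parameters(), lr=0.05)
optimizer.regroup_param_groups(analog_model)

for epoch in range(10):
    analog_model.train()
    for batch in train_loader:
        optimizer.zero_grad()
        outputs = analog_model(batch["image"])
        loss = loss_fn(outputs, batch["label"])
        loss.backward()
        optimizer.step()   # updates go through the simulated analog tiles
```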


License

This project is licensed under the Apache License 2.0.
