In this tutorial, I will guide you through using Google Colab for the fast.ai lessons.
Google Colab is a tool that provides a free GPU machine continuously for up to 12 hours. You can even reconnect to a different GPU machine after the 12 hours are up.
Here are the simple steps for running the fast.ai notebooks on Google Colab.
- Download the fast.ai lesson notebooks from https://github.com/fastai/fastai/tree/master/courses/dl1
- Log in to your Google (Gmail) account in a browser.
- Go to Colaboratory at https://research.google.com/colaboratory/unregistered.html
- A pop-up window will appear; close it.
- On Colab, upload whichever lesson notebook you want to work on from your downloaded files (File -> Upload Notebook).
- Now change your runtime to a GPU machine and choose the Python version (Python 2 or Python 3) you are going to use (Runtime -> Change runtime type).
- You can check whether the GPU is running by executing the following code:
import tensorflow as tf
tf.test.gpu_device_name()
It should come up with the output '/device:GPU:0'.
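Since fast.ai runs on PyTorch rather than TensorFlow, it is also worth repeating this check from PyTorch's side once the install steps below are done (a minimal sketch):

import torch
# True means PyTorch can see the Colab GPU; False usually means the
# runtime type is still set to CPU
print(torch.cuda.is_available())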
Now install the following libraries in your notebook by inserting code cells (Insert -> Code cell):
- Install PyTorch using
!pip install http://download.pytorch.org/whl/cu80/torch-0.3.0.post4-cp36-cp36m-linux_x86_64.whl && pip install torchvision
- Install fast.ai using
!pip install fastai
- Install libSM using
!apt update && apt install -y libsm6 libxext6
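As a quick sanity check that the installs worked (a sketch; the exact version string will vary with the wheel you installed):

import torch
import torchvision
import fastai  # should import without errors after the pip installs above
print(torch.__version__)  # e.g. 0.3.0.post4 for the wheel above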
Download a dataset using bash commands; as an example, the dogs vs. cats dataset:
!mkdir data && wget http://files.fast.ai/data/dogscats.zip && unzip dogscats.zip -d data/
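You can confirm the download and unzip worked by listing the extracted folders (a sketch; I am assuming the archive's usual train/valid layout):

import os
# The zip unpacks into data/dogscats; expect folders such as 'train' and 'valid'
print(os.listdir('data/dogscats'))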
Now you are ready to use fast.ai on Google Colab.
Enjoy!!!
Nice work.
It works! 🙂 Thanks.
Actually, there are issues. I find that apparently Google scrubs your directory after you close your .ipynb file, so the dogscats data has to be re-downloaded every session; same with loading in torch, fast.ai, … Although I get messages indicating those libraries have already been installed, the notebook fails without running that series of commands in order.
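For now I keep one setup cell at the top of the notebook and re-run it at the start of every session (a sketch reusing the tutorial's own commands; pip skips packages that are already present, and the last line skips the download if the data folder already exists):

!pip install http://download.pytorch.org/whl/cu80/torch-0.3.0.post4-cp36-cp36m-linux_x86_64.whl torchvision
!pip install fastai
!apt update && apt install -y libsm6 libxext6
![ -d data/dogscats ] || (mkdir -p data && wget -q http://files.fast.ai/data/dogscats.zip && unzip -q dogscats.zip -d data/)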
Also, I often hit a code block that errors out complaining about the JavaScript widget still loading: 'Failed to display Jupyter Widget of type HBox'. Not long after running the model, the environment gave me an out-of-memory warning.
And now, around the 'Fine-tuning and differential learning rate annealing' section, I'm getting an error telling me np is not defined anymore. Hmmm, seems like I have to do this on Paperspace or something after all 🙁
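If anyone else hits the 'np is not defined' error: I suspect the runtime restarted and cleared all notebook state, so the lesson's import cells have to be re-run; at a minimum:

import numpy as np  # the lesson notebooks refer to numpy as np throughout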
Once you are logged in to Google Colab with your Google account and assigned a GPU machine, you can access the same machine continuously for 12 hours using the same account. So you need not run the same commands again and again.
If you get an error about 'running out of memory', please ignore it and proceed, since the machine has 13 GB of RAM. It will work for lesson 1, but for other computer vision problems you will hit genuine out-of-memory errors, so it is better to switch to Paperspace or something else.
But since it is free of cost and gives you a machine with a 13 GB GPU, you can run plenty of other code on it, which is much better than running on a CPU only.
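If you do hit a real out-of-memory error later in the course, one thing worth trying before switching platforms is a smaller batch size. A sketch against the lesson 1 API (I'm assuming the lesson's usual resnet34 / sz=224 setup; only bs is changed from the default of 64):

from fastai.conv_learner import *
PATH = 'data/dogscats/'
arch = resnet34
sz = 224
# Halving the batch size roughly halves the GPU memory needed per step
data = ImageClassifierData.from_paths(PATH, tfms=tfms_from_model(arch, sz), bs=32)
learn = ConvLearner.pretrained(arch, data, precompute=True)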
Do you consistently get 100% of the GPU? I get 5% of it 99% of the times I connect to it. I have only seen 100% of the GPU RAM about 2 times over many, many attempts. I have just retested it and am getting only 5% of the GPU RAM.
Please see: https://stackoverflow.com/questions/48750199/google-colaboratory-misleading-information-about-its-gpu-only-5-ram-available
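For anyone who wants to check their own allocation, the simplest probe I know of (Colab GPU runtimes ship with the NVIDIA driver tools):

!nvidia-smi  # the Memory-Usage column shows used vs. total GPU RAM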
Perhaps they do a different allocation depending on where you connect from? I connect from Canada.
Based on the comments on my Stack Overflow post, it seems to be an issue for many users.
Do you connect from US?
Thank you.
I am connecting from India.
I am facing the same issue of not getting 100% of the GPU memory all the time, but I have completed lesson 1 of fast.ai dl1 without any memory issues.
Still figuring out how GPU memory is being distributed by Google to users. Will update soon.
Thanks!
This is the final answer I’ve been looking for!! thank you.
This thing works! thanks!
Thank You! You are a Star!!!!!!!!!!!!!!!