This repository is the central hub for the code and resources accompanying the deep learning tutorials I publish on my blog. The goal is to provide practical, hands-on examples that solidify core deep learning concepts.
The idea is to maintain a well-organized, ever-growing collection: each tutorial is a deep dive into a specific topic, breaking complex ideas down into understandable, runnable code.
To get a local copy up and running, follow these simple steps.
Ensure you have Python 3.x installed on your system. You can download it from python.org.
- Clone the repo:

  ```shell
  git clone https://github.com/your_username/your_repository.git
  ```

- Navigate to the project directory:

  ```shell
  cd your_repository
  ```

- Install the required packages:

  ```shell
  pip install -r requirements.txt
  ```
Please note that, since the tutorials cover different topics, some notebooks may require additional libraries, so `requirements.txt` might not always be up to date. Sorry about that.
Here you'll find a curated list of in-depth tutorials.
It's essential to understand, or at least have some knowledge of, the convolution operation's workings and nature to comprehend Convolutional Neural Networks (CNNs). Convolution itself is a core mathematical operation, integral to various domains including signal processing, image processing, and particularly deep learning. The true power of the convolution operation lies in its ability to offer a robust means of observing and characterizing physical systems. Let's examine the mechanics of this operation!
- Blog Post: The Convolution Operation
- Code:
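As a quick illustrative sketch (my own NumPy helper, not code from the tutorial): a 1-D discrete convolution flips the kernel and slides it along the signal, taking a dot product at each shift.

```python
import numpy as np

def conv1d(signal, kernel):
    """Full 1-D discrete convolution: (f * g)[n] = sum_k f[k] * g[n - k]."""
    n = len(signal) + len(kernel) - 1                 # length of the "full" output
    flipped = kernel[::-1]                            # convolution flips the kernel
    padded = np.pad(signal, (len(kernel) - 1,) * 2)   # zero-pad so every shift is valid
    return np.array([np.dot(padded[i:i + len(kernel)], flipped) for i in range(n)])

signal = np.array([1.0, 2.0, 3.0])
kernel = np.array([0.0, 1.0, 0.5])
print(conv1d(signal, kernel))   # same result as np.convolve(signal, kernel)
```

This is exactly what `np.convolve` computes in "full" mode; CNN layers use the same sliding dot product, typically in 2-D and with learned kernels.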
Graph Neural Networks (GNNs) are a class of deep learning methods designed to perform inference on data described by graphs. This tutorial will guide you through the fundamentals of GNNs and introduce you to
Spektral, a Python library for building graph neural networks with TensorFlow and Keras.
- Blog Post: I want to read it!
- Code: gnn_spektral_intro.ipynb
While frameworks like TensorFlow and PyTorch have revolutionized AI development, they often come with a certain rigidity and verbosity. Enter JAX, Google's high-performance numerical computing library that's shaking things up. At least, according to the experts! JAX is a toolkit built on NumPy, designed for high-performance numerical computation and machine learning research.
- Blog Post: I want to read it!
- Code: jax_101.ipynb
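To give a flavor of that NumPy-like API plus function transformations (a minimal sketch assuming `jax` is installed; not taken from the notebook):

```python
import jax
import jax.numpy as jnp

# A plain Python function written against JAX's NumPy-compatible API.
def loss(w):
    return jnp.sum(w ** 2)

grad_loss = jax.grad(loss)          # transform: returns a new function computing d(loss)/dw

w = jnp.array([1.0, -2.0, 3.0])
print(grad_loss(w))                 # gradient of sum(w^2) is 2*w
```

`jax.grad` is one of several composable transformations (alongside `jax.jit` and `jax.vmap`) that the tutorial explores in more depth.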
So, why are we writing about Flax if the title promises JAX? From their original documentation: Flax provides a flexible end-to-end user experience for researchers and developers who use JAX for neural networks. Flax enables you to use the full power of JAX.
- Blog Post: I want to read it!
- Code: flax_101.ipynb
Let's implement a foundational Graph Neural Network layer based purely on the message passing paradigm. This will be a more general "GNN layer" than a specific GCN, allowing us to understand the core mechanics without the extra complexity of convolutional normalization.
- Blog Post: I want to read it!
- Code: gnns.ipynb
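The message-passing idea can be sketched in plain NumPy (the function name, shapes, and toy graph below are my own assumptions, not the notebook's code): each node sums its neighbors' feature vectors (the messages), then updates its own features with a learned linear transform and a nonlinearity.

```python
import numpy as np

def message_passing_layer(X, A, W):
    """One generic message-passing step.

    X: (num_nodes, in_dim)    node features
    A: (num_nodes, num_nodes) adjacency matrix (1 = edge)
    W: (in_dim, out_dim)      learnable weights
    """
    messages = A @ X                      # aggregate: sum of neighbor features per node
    return np.maximum(messages @ W, 0.0)  # update: linear transform + ReLU

# Tiny path graph with edges 0-1 and 1-2
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.eye(3)             # one-hot node features
W = np.ones((3, 2))       # toy weights
print(message_passing_layer(X, A, W))
```

A GCN is this same pattern with `A` replaced by a degree-normalized adjacency matrix, which is precisely the "convolutional normalization" the tutorial sets aside.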
Contributions are what make the open-source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement".
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Distributed under the MIT License. See LICENSE for more information.
Igor Azevedo - @igorlrazevedo - igorlima1740@gmail.com