
Add new deep learning functionality #78

Open
ncguilbeault wants to merge 30 commits into bonsai-rx:main from ncguilbeault:dev/torch-neuralnets-additions

Conversation

@ncguilbeault
Collaborator

This PR adds new functionality to the Bonsai.ML.Torch package to allow creating deep learning modules directly in Bonsai. Operators were added to the Bonsai.ML.Torch.NeuralNets namespace to support a wide range of deep learning functionality, including the direct creation of torch modules (convolutional layers, pooling layers, padding layers, recurrent neural networks, transformers, etc.), and existing operators were refactored to support this new functionality alongside their previous behavior.

The PR in #77 should be merged before this one.

@ncguilbeault force-pushed the dev/torch-neuralnets-additions branch 2 times, most recently from 2df9d1a to e9ad97a on December 16, 2025
@ncguilbeault force-pushed the dev/torch-neuralnets-additions branch 2 times, most recently from d4c9318 to 5369308 on December 19, 2025
@ncguilbeault force-pushed the dev/torch-neuralnets-additions branch from 5369308 to a4400fc on January 13, 2026
…rrectly display element name and properties of module
@ncguilbeault force-pushed the dev/torch-neuralnets-additions branch from a4400fc to e9f4d09 on January 19, 2026
@ncguilbeault requested a review from glopesdev on March 11, 2026
@glopesdev changed the title from "Addition of new deep learning functionality" to "Add new deep learning functionality" on Mar 12, 2026
Member

@glopesdev left a comment


Massive effort, thanks for putting this all together! Lots of comments below mainly on naming and organization for us to think about.

/// See <see href="https://pytorch.org/docs/stable/generated/torch.nn.CELU.html"/> for more information.
/// </remarks>
[Description("Creates a continuously differentiable exponential linear unit (CELU) activation function.")]
[DisplayName("CELU")]

We have on occasion changed the display name for dynamic operators, but I usually try to avoid doing it for casing reasons, since in the XML the actual class name will persist, which may create confusion between diffs and the visual editor.

I agree this is a strange acronym; I looked elsewhere and couldn't find a better name, but I would still keep the C# naming conventions.
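For context on what the operator wraps: the PyTorch docs define CELU as celu(x) = max(0, x) + min(0, α·(exp(x/α) − 1)). A minimal plain-Python sketch of that definition (function and names here are illustrative, not part of the package):

```python
import math

def celu(x: float, alpha: float = 1.0) -> float:
    """Continuously differentiable ELU, per the PyTorch definition."""
    return max(0.0, x) + min(0.0, alpha * (math.exp(x / alpha) - 1.0))

# Positive inputs pass through unchanged; large negative inputs
# saturate at -alpha, and the function is smooth at x = 0.
print(celu(2.0))   # 2.0
print(celu(-1e9))  # -1.0 (saturated, for alpha = 1.0)
```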

/// See <see href="https://pytorch.org/docs/stable/generated/torch.nn.ELU.html"/> for more information.
/// </remarks>
[Description("Creates an exponential linear unit (ELU) activation function.")]
[DisplayName("ELU")]

Same comment as above.

/// See <see href="https://pytorch.org/docs/stable/generated/torch.nn.GELU.html"/> for more information.
/// </remarks>
[Description("Creates a gaussian error linear unit (GELU) activation function.")]
[DisplayName("GELU")]

Same as above.

/// </remarks>
[Description("Creates a gated linear unit (GLU) module.")]
[DisplayName("GLU")]
public class Glu

Same as above. Alternatively, we could emulate our decision in the Torch package and simply expand all acronyms into their full long names to favor readability, as these acronyms are impenetrable to cursory reading, and easily confused with each other.
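Expanding the acronym would also help readers who don't know the math behind it. GLU, per the PyTorch docs, splits its input in half along a dimension and gates one half by the sigmoid of the other. A plain-Python sketch of that definition over a flat list (illustrative only):

```python
import math

def glu(x: list[float]) -> list[float]:
    """Gated linear unit: split input in half, gate the first
    half elementwise by the sigmoid of the second half."""
    if len(x) % 2 != 0:
        raise ValueError("GLU requires an even-sized input")
    half = len(x) // 2
    a, b = x[:half], x[half:]
    return [ai * (1.0 / (1.0 + math.exp(-bi))) for ai, bi in zip(a, b)]

print(glu([1.0, 2.0, 0.0, 0.0]))  # sigmoid(0) = 0.5 -> [0.5, 1.0]
```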

/// See <see href="https://pytorch.org/docs/stable/generated/torch.nn.LeakyReLU.html"/> for more information.
/// </remarks>
[Description("Creates a leaky rectified linear unit (LeakyReLU) activation function.")]
[DisplayName("LeakyReLU")]

I can see that the casing in torch is all over the place, so maybe we just default to expanding to long names and make everything clearer?
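The operator itself is simple regardless of naming; the PyTorch docs define LeakyReLU as x for x ≥ 0 and negative_slope·x otherwise. A one-line sketch of that definition (illustrative, not from the package):

```python
def leaky_relu(x: float, negative_slope: float = 0.01) -> float:
    """LeakyReLU per the PyTorch definition: identity for non-negative
    inputs, a small linear slope for negative inputs."""
    return x if x >= 0.0 else negative_slope * x

print(leaky_relu(3.0))   # 3.0
print(leaky_relu(-2.0))  # -0.02
```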

/// <summary>
/// Represents an operator that saves a module's state to a file.
/// </summary>
[Combinator]

Same as above.

/// </summary>
[XmlInclude(typeof(Shuffle.ChannelShuffle))]
[DefaultProperty(nameof(ShuffleModule))]
[Combinator]

Same as above.

[XmlInclude(typeof(Sparse.EmbeddingBagFromPretrained))]
[XmlInclude(typeof(Sparse.EmbeddingFromPretrained))]
[DefaultProperty(nameof(SparseModule))]
[Combinator]

Same as above.

[XmlInclude(typeof(Transformer.TransformerEncoder))]
[XmlInclude(typeof(Transformer.TransformerEncoderLayer))]
[DefaultProperty(nameof(TransformerModule))]
[Combinator]

Same as above.

[XmlInclude(typeof(Vision.PixelUnshuffle))]
[XmlInclude(typeof(Vision.Upsample))]
[DefaultProperty(nameof(VisionModule))]
[Combinator]

Same as above.

@glopesdev glopesdev added this to the v0.5.0 milestone Mar 12, 2026