Version 1.0.0
Major Underlying Changes
Integration of Ibex with immApex
Updated Seurat object to v5
Runs using basilisk instead of reticulate - no manual installation of Python packages required
Feature Changes
Converted Ibex.matrix() to Ibex_matrix()
Updated runIbex() support for the SingleCellExperiment (SCE) format
Updated CoNGAfy() to function with all versions of Seurat
Updated quietBCRgenes() to use a VariableFeatures() call for Seurat v5, with backward compatibility
Added getHumanIgPseudoGenes() to return the list of human immunoglobulin pseudogenes that are kept by quietBCRgenes()
New Models
Added New Light and Heavy Chain Models
Encoding methods now accepted: "OHE", "atchleyFactors", "crucianiProperties", "kideraFactors", "MSWHIM", "tScales", "zScales"
Sequence input:
Human Heavy: 10,000,000
Human Light: 5,000,000
Human Heavy-Expanded: 5,000,000
Human Light-Expanded: 2,500,000
Mouse Heavy: 5,000,000
Mouse Heavy-Expanded: 5,000,000
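As an illustration of the simplest encoding method above, "OHE" (one-hot encoding) maps each amino acid position to a binary indicator vector. This is a hypothetical sketch only; the alphabet ordering, padding length, and output layout used internally by Ibex/immApex may differ.

```python
import numpy as np

# Standard 20-letter amino acid alphabet; the ordering here is an assumption.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def one_hot_encode(sequence, max_length=45):
    """Encode a CDR3 sequence as a (max_length, 20) binary matrix, zero-padded.

    max_length=45 is an illustrative choice, not a documented Ibex default.
    """
    matrix = np.zeros((max_length, len(AMINO_ACIDS)), dtype=np.float32)
    for pos, aa in enumerate(sequence[:max_length]):
        matrix[pos, AA_INDEX[aa]] = 1.0
    return matrix

encoded = one_hot_encode("CARDYW")
```

Property-based methods such as "atchleyFactors" or "kideraFactors" follow the same positional layout but substitute a vector of continuous physicochemical values for the binary indicator.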
Trained convolutional and variational autoencoders for Heavy/Light chains
Architecture: 512-256-128-256-512
Parameters:
Batch Size = 128
Latent Dimensions = 128
Epochs = 100
Loss = Mean squared error (CNN); mean squared error + KL divergence (VAE)
Activation = ReLU
Learning rate = 1e-6
Optimizer = Adam
Early stopping with a patience of 10 epochs, monitoring validation loss and restoring the best weights
CNN autoencoders have batch normalization layers between the dense layers.
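The 512-256-128-256-512 layout above can be sketched as a plain dense autoencoder forward pass. This is a NumPy illustration with random weights, not the trained models; the input width of 900 (45 positions x 20 amino acids) is an assumption, and batch normalization is shown as simple per-feature standardization over the batch.

```python
import numpy as np

rng = np.random.default_rng(42)

# 900 -> 512 -> 256 -> 128 (latent) -> 256 -> 512 -> 900
LAYER_SIZES = [900, 512, 256, 128, 256, 512, 900]

def relu(x):
    return np.maximum(x, 0.0)

def batch_norm(x, eps=1e-5):
    # Per-feature standardization over the batch; the learned affine
    # parameters of a real BatchNorm layer are omitted for brevity.
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def forward(x, weights):
    """Run a batch through the autoencoder; the 128-wide layer is the latent."""
    latent = None
    for w in weights:
        x = relu(batch_norm(x @ w))
        if x.shape[1] == 128:
            latent = x
    return x, latent

weights = [rng.normal(0.0, 0.05, (a, b))
           for a, b in zip(LAYER_SIZES, LAYER_SIZES[1:])]
batch = rng.random((128, 900))  # batch size 128, matching the training setup
recon, latent = forward(batch, weights)
```

The 128-dimensional bottleneck is what Ibex returns as the latent embedding per cell; the decoder half (128-256-512-900) exists only so reconstruction loss can be computed during training.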