Releases: BerkeleyLab/fiats
0.2.0 JSON file read/write capability, more accurate default inference strategy
0.1.2 Bug fix and documentation edits
0.1.1
What's Changed
- Prepare open-source release by @rouson in #18
- Add LICENSE.txt file with copyright notice and license agreement
- Add statement referring to the license at the top of each source file
- Add build instructions to the README.md
- Add a basic `ford` project file
- Set up the CI to post the `ford` documentation to GitHub Pages
- Add asymmetric network test by @rouson in #19
- The new test uses a network that encodes a two-input/one-output digital logic circuit that performs operations equivalent to "XOR AND input-2".
- Fix asymmetric-network test of `matmul`-based inference by @rouson in #20: addresses an issue that the new asymmetric test exposed.
Full Changelog: 0.1.0...0.1.1
Initial Release: Concurrent Inference Capability
This release provides an `inference_engine_t` type that encapsulates the state and behavior of a dense neural network with
- State:
a. Weights and biases gathered into contiguous, multidimensional arrays, 🏋️
b. Hidden layers with a uniform number of neurons. 🧠
- Behavior:
a. A `pure` `infer` type-bound procedure that propagates input through the above architecture to produce output. ❄️
b. An `elemental` interface for activation functions with one currently implemented: a step function. 🪜
c. Runtime selection of the inference method via the Strategy Pattern: concurrent execution of `dot_product` intrinsic function invocations or `matmul` intrinsic function invocations.
d. Runtime selection of activation functions: currently only a step function is implemented to support the unit tests.
- Unit tests that
a. Read and write neural networks to files, 📁
b. Construct a network that implements an exclusive-or (XOR) gate in the first hidden layer, followed by a second layer with weights described by the identity matrix so that the second layer serves a pass-through role. 🚧
- Examples
a. Concurrent inference on multiple independent neural networks encapsulated in `inference_engine_t` objects. 🤔
b. Construction and writing of a neural network to a file starting from user-defined weights and biases (useful for unit testing).
c. Reading a neural network from a file and querying it for basic properties.
d. Reading from and writing to a NetCDF file (for future incorporation into the Inference-Engine library for purposes of reading validation data sets). 🥅
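The XOR construction used in the unit tests can be sketched with a step activation. This is a hypothetical, self-contained illustration; the weight values, thresholds, and function names below are assumptions chosen to realize XOR with a step function, not the test's actual network data:

```fortran
! Hypothetical sketch of an XOR network with a step activation.
! Weights and thresholds are illustrative assumptions, not the
! values used in the library's unit tests.
program xor_sketch
  implicit none
  integer :: a, b
  do a = 0, 1
    do b = 0, 1
      print '(3i2)', a, b, xor_net(real(a), real(b))
    end do
  end do
contains
  pure integer function step(x)  ! step activation: 1 if x > 0, else 0
    real, intent(in) :: x
    step = merge(1, 0, x > 0.)
  end function
  pure integer function xor_net(x1, x2)
    real, intent(in) :: x1, x2
    integer :: h1, h2
    h1 = step(x1 + x2 - 0.5)                   ! OR neuron
    h2 = step(x1 + x2 - 1.5)                   ! AND neuron
    xor_net = step(real(h1) - real(h2) - 0.5)  ! OR .and. .not. AND = XOR
  end function
end program
```

A second layer with identity-matrix weights would simply pass these hidden-layer outputs through unchanged, as the test description above indicates.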
Because the `infer` procedure is `pure`, it can be called inside a `do concurrent` construct, which facilitates concurrent inference using multiple, independent neural networks.
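The pattern above can be sketched as follows. Only the type name `inference_engine_t` and the `pure` `infer` procedure come from this release; the stand-in type, its component layout, and the `infer` signature below are assumptions for illustration:

```fortran
! Hypothetical, self-contained sketch of calling a pure type-bound
! inference procedure inside do concurrent.  The stand-in engine_t type
! mimics the release's inference_engine_t; its components and the infer
! signature are assumptions.
module engine_sketch_m
  implicit none
  type engine_t
    real, allocatable :: weights(:,:)
  contains
    procedure :: infer
  end type
contains
  pure function infer(self, x) result(y)
    class(engine_t), intent(in) :: self
    real, intent(in) :: x(:)
    real, allocatable :: y(:)
    y = merge(1., 0., matmul(self%weights, x) > 0.)  ! step activation
  end function
end module

program concurrent_inference
  use engine_sketch_m, only : engine_t
  implicit none
  type(engine_t) :: engines(2)
  real :: outputs(1,2)
  integer :: i
  engines(1)%weights = reshape([ 1.,  1.], [1,2])
  engines(2)%weights = reshape([-1., -1.], [1,2])
  do concurrent (i = 1:2)  ! legal because infer is pure
    outputs(:,i) = engines(i)%infer([1., 0.])
  end do
  print '(2f5.1)', outputs
end program
```

Because each iteration touches a distinct engine and a distinct slice of `outputs`, the iterations have no data dependences, which is what `do concurrent` asserts to the compiler.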
Full Changelog: https://github.com/BerkeleyLab/inference-engine/commits/0.1.0