This C++ program demonstrates a basic implementation of a perceptron trained using a loss function and the perceptron trick. The perceptron is a fundamental building block of neural networks and is capable of binary classification.
- C++ compiler (e.g., g++)
- Clone the repository:

  ```sh
  git clone https://github.com/your-username/perceptron-cpp.git
  ```

- Compile the C++ program:

  ```sh
  g++ perceptron.cpp -o perceptron
  ```

- Run the executable:

  ```sh
  ./perceptron
  ```
The program expects two CSV files:
- `train.csv` - Training data containing input features and corresponding binary labels.
- `test.csv` - Testing data for evaluating the trained perceptron.
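For context, here is a minimal sketch of how such a file might be read line by line before parsing each row. The file name comes from the list above; everything else is an assumption, not the repository's actual code:

```cpp
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream train("train.csv");  // training data as listed above
    if (!train) {
        std::cerr << "could not open train.csv\n";
        return 1;
    }
    std::string line;
    while (std::getline(train, line)) {
        // Each line is expected to hold the feature values and the binary label;
        // it would then be parsed into numbers (see split_nums below).
    }
    return 0;
}
```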
The program defines the following helper functions:

- `split_nums`: Function to split a string into a vector of doubles.
- `printCols`: Function to print input features and labels.
- `sum`: Function to calculate the sum of two vectors, including a bias term.
- `classifier`: Function to classify input using the trained perceptron.
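As an illustration, the parsing and classification helpers might look roughly like this. The signatures and the comma delimiter are assumptions based only on the descriptions above, not the repository's actual code:

```cpp
#include <cstddef>
#include <sstream>
#include <string>
#include <vector>

// Sketch of split_nums: splits a delimited string into doubles
// (the comma delimiter is an assumption; the real function may differ).
std::vector<double> split_nums(const std::string &line) {
    std::vector<double> nums;
    std::stringstream ss(line);
    std::string token;
    while (std::getline(ss, token, ','))
        nums.push_back(std::stod(token));
    return nums;
}

// Sketch of classifier: weighted sum of inputs plus a bias term,
// followed by a step function for the binary decision.
int classifier(const std::vector<double> &x,
               const std::vector<double> &w, double bias) {
    double z = bias;
    for (std::size_t i = 0; i < x.size(); ++i)
        z += w[i] * x[i];
    return z >= 0.0 ? 1 : 0;  // step activation
}
```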
The program performs training using the perceptron trick and then tests the perceptron on a separate dataset; a sketch of the training loop follows the parameters below.
- Learning Rate: 0.1
- Epochs: 1000
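A minimal sketch of how such a training loop could look, assuming the standard perceptron-trick update `w += learning_rate * (label - prediction) * x` and the hypothetical `classifier` sketched above. The default arguments mirror the parameters listed above; this is illustrative, not the repository's actual code:

```cpp
#include <cstddef>
#include <vector>

// Declaration of the hypothetical classifier sketched earlier.
int classifier(const std::vector<double> &x,
               const std::vector<double> &w, double bias);

// Perceptron-trick training sketch: when a sample is misclassified,
// nudge the weights and bias toward the correct side of the boundary.
void train(std::vector<double> &w, double &bias,
           const std::vector<std::vector<double>> &X,
           const std::vector<double> &y,
           double learning_rate = 0.1, int epochs = 1000) {
    for (int epoch = 0; epoch < epochs; ++epoch) {
        for (std::size_t i = 0; i < X.size(); ++i) {
            double error = y[i] - classifier(X[i], w, bias);  // 0 if correct, +1/-1 if wrong
            if (error != 0.0) {
                for (std::size_t j = 0; j < w.size(); ++j)
                    w[j] += learning_rate * error * X[i][j];  // perceptron trick update
                bias += learning_rate * error;
            }
        }
    }
}
```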
- This is a very basic single-neuron perceptron with random initial weights, and the included sample data is small (~100 rows), so outputs may vary considerably between runs. It is intended for learning purposes only.
Feel free to contribute by opening issues or submitting pull requests.
This project is licensed under the MIT License - see the LICENSE.md file for details.
- Inspired by the concept of perceptrons and neural networks.