Melpy is a package designed for learning machine learning and for making its use as simple as possible.
The project started in 2022 when I was still in high school. While taking an online machine learning course, I got frustrated with not fully understanding how the algorithms worked. To solve this, I decided to implement them myself to gain a deeper and clearer understanding.
What started as a simple Python script has become Melpy, a deep learning library built entirely from scratch using NumPy. Melpy is inspired by the best tools available and makes it easy to create and train models like FNNs and CNNs. It also includes tools for data preprocessing and data visualization, making it a complete solution for deep learning.
Melpy requires an up-to-date Python environment. I recommend conda, a package and environment manager widely used for scientific Python.
All other dependencies will be installed automatically during the library installation process.
Melpy is available on PyPI as melpy. Run the following command to install it in your environment:
pip3 install melpy --upgrade
To demonstrate Melpy’s capabilities, let’s work through a mini-project together. We will classify the Iris dataset, a classic example in machine learning. The dataset contains three classes: Setosa, Versicolor, and Virginica, described by the following features: Sepal Length, Sepal Width, Petal Length, and Petal Width.
First, let’s load the data and split it into training and test sets:
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
iris_dataset = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
iris_dataset['data'], iris_dataset['target'], test_size=0.25, random_state=0)
Next, visualize the data to identify any patterns:
import matplotlib.pyplot as plt
plt.figure()
plt.scatter(X_train[:,0], X_train[:,1], c=y_train, alpha=0.3, cmap="coolwarm")
plt.show()
Figure 1: Training samples plotted by Sepal Length and Sepal Width, colored by species.
As we can see in Figure 1, there is a clear correlation between species and features like Sepal Length and Sepal Width.
FNNs require input data to be scaled close to zero, which is why we are now going to use StandardScaler from melpy.preprocessing:
The Standard Scaler is a preprocessing technique that subtracts the mean of a dataset and divides by its standard deviation. You can find out more about data scaling here: Feature Scaling.
from melpy.preprocessing import StandardScaler
sc = StandardScaler()
X_train = sc.transform(X_train) # Scales data
X_test = sc.transform(X_test) # Scaled with the same mean and standard deviation as X_train
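For intuition, here is the same transformation written directly in NumPy. This is a minimal sketch of the math, not Melpy's implementation:

import numpy as np

# Standardization: center each feature on the training mean,
# then divide by the training standard deviation.
mean = X_train.mean(axis=0)
std = X_train.std(axis=0)

X_train_scaled = (X_train - mean) / std
X_test_scaled = (X_test - mean) / std  # reuse the training statistics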
Next, we encode the target labels using OneHotEncoder, also from melpy.preprocessing:
One-hot encoding is a method of representing categorical data as binary vectors. Each unique category is assigned a unique vector where one element is set to 1 (hot) and all others are 0. You can find out more about data encoding here: One-hot.
from melpy.preprocessing import OneHotEncoder
ohe = OneHotEncoder()
y_train = ohe.transform(y_train) # Encodes data
y_test = ohe.transform(y_test) # Encoded with the same category mapping as y_train
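For intuition, the same idea can be expressed in a few lines of NumPy. This is a sketch of the concept, not Melpy's implementation:

import numpy as np

labels = np.array([0, 2, 1])   # three Iris class indices
one_hot = np.eye(3)[labels]    # row i is the binary vector for labels[i]
print(one_hot)
# [[1. 0. 0.]
#  [0. 0. 1.]
#  [0. 1. 0.]]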
We’re tackling a multi-class classification problem on tabular data, which calls for a fully connected network with a softmax output layer trained on a categorical cross-entropy loss.
Now, let’s build the model using Melpy’s Sequential class:
Sequential models are neural networks where layers are stacked in a linear order. Data flows through them one by one in sequence.
import melpy.NeuralNetworks as nn
model = nn.Sequential(X_train, y_train, X_test, y_test)
model.add(nn.Dense(X_train.shape[1], 6), nn.ReLU())
model.add(nn.Dense(6, y_train.shape[1]), nn.Softmax())
model.compile(cost_function=nn.CategoricalCrossEntropy(), optimizer=nn.SGD(learning_rate=0.01))
We define:

- a hidden Dense layer with 6 neurons followed by a ReLU activation,
- an output Dense layer with one neuron per class followed by a Softmax activation,
- categorical cross-entropy as the cost function,
- SGD with a learning rate of 0.01 as the optimizer.
These functions together form what we call an architecture. If you’re new to deep learning, I highly recommend 3Blue1Brown's excellent video series on the topic. It provides a clear explanation of how and why these functions are used.
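To make the data flow concrete, here is a minimal NumPy sketch of the forward pass this stack computes. The parameter values are random placeholders, and this is not Melpy's internal code:

import numpy as np

def relu(z):
    return np.maximum(0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # subtract max for numerical stability
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 6)), np.zeros(6)  # Dense: 4 features -> 6 neurons
W2, b2 = rng.normal(size=(6, 3)), np.zeros(3)  # Dense: 6 neurons -> 3 classes

X = rng.normal(size=(5, 4))        # 5 samples, 4 features
hidden = relu(X @ W1 + b1)         # Dense + ReLU
probs = softmax(hidden @ W2 + b2)  # Dense + Softmax: each row sums to 1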
We can view the model structure with:
model.summary()
Dense: (1, 6)
ReLU: (1, 6)
Dense: (1, 3)
Softmax: (1, 3)
Finally, we train the model for 5000 epochs and monitor the results with verbose and LiveMetrics:
model.fit(epochs=5000, verbose=1, callbacks=[nn.LiveMetrics()])
model.results()
Figure 2: Live metrics displayed during training.
Epoch [5000/5000]: 100%|██████████| 5000/5000 [00:03<00:00, 1543.94it/s, loss=0.0389, accuracy=0.988]
-------------------------------------------------------------------
| [TRAINING METRICS] train_loss: 0.03893 · train_accuracy: 0.9881 |
-------------------------------------------------------------------
| [VALIDATION METRICS] val_loss: 0.06848 · val_accuracy: 0.98246 |
-------------------------------------------------------------------
Our model achieves 98% accuracy on both training and test datasets, which is good! With further optimization you could potentially reach 100%. Feel free to experiment!
If you look closely, you will notice that the plot on the right closely resembles Figure 1. It’s actually the model’s inputs colored by the outputs, allowing us to visually assess whether the model is well trained.
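You can reproduce this kind of check by hand by plotting the inputs colored by the predicted classes. Note that model.predict is an assumption here; substitute whichever Melpy method returns the network's output probabilities:

import numpy as np
import matplotlib.pyplot as plt

probs = model.predict(X_test)  # hypothetical API, adjust to Melpy's actual method
predictions = np.argmax(probs, axis=1)

plt.figure()
plt.scatter(X_test[:, 0], X_test[:, 1], c=predictions, alpha=0.3, cmap="coolwarm")
plt.show()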
Save your trained parameters and metrics for future use:
model.save_params("iris_parameters")
model.save_histories("iris_metrics")
You can reload the parameters with load_params(path) and the metrics using the pickle library.
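As a rough sketch, reloading could look like the following. The file name on disk and the histories format are assumptions based on the save calls above, so adjust them to what save_histories actually wrote:

import pickle

model.load_params("iris_parameters")  # assumed to mirror save_params

with open("iris_metrics", "rb") as f:  # file name assumed from save_histories above
    histories = pickle.load(f)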
For more examples, please refer to the Documentation.
I plan to speed up computations using Numba or JAX and to implement additional deep learning architectures, as well as more traditional machine learning algorithms.
See the open issues for a full list of proposed features (and known issues).
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
1. Fork the project
2. Create your feature branch: git checkout -b feature/AmazingFeature
3. Commit your changes: git commit -m 'Add some AmazingFeature'
4. Push to the branch: git push origin feature/AmazingFeature
5. Open a pull request

Distributed under the MIT License. See LICENSE.txt for more information.
Lenny Malard - lennymalard@gmail.com or LinkedIn
Project Link: https://github.com/lennymalard/melpy-project