Installation
Setting up Nevaarize on your system
Requirements
Before you begin, ensure you have the following installed:
- C++23 compatible compiler — GCC 13+ or Clang 16+
- Make — For building the project
- Git — For cloning the repository
- Linux x86-64 — Currently the primary supported platform
Nevaarize uses AVX2 SIMD instructions for maximum performance. Most x86-64 CPUs from 2013 onwards support AVX2 (Intel Haswell and later; AMD from roughly 2015).
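Before building, you can confirm AVX2 support on Linux, where the kernel exposes CPU feature flags in /proc/cpuinfo. This is a minimal sketch (not part of the Nevaarize tooling):

```shell
# Check whether the CPU advertises AVX2 (Linux lists feature flags in /proc/cpuinfo)
if grep -qm1 '\bavx2\b' /proc/cpuinfo 2>/dev/null; then
  echo "AVX2 supported"
else
  echo "AVX2 not detected: Nevaarize's SIMD paths may not work on this machine"
fi
```

On non-Linux systems /proc/cpuinfo does not exist, so the check falls through to the "not detected" branch.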
Clone the Repository
First, clone the Nevaarize repository:
git clone https://github.com/gtkrshnaaa/nevaarize.git
cd nevaarize
Build from Source
Nevaarize uses a simple Makefile for building. Run:
make
This will compile all source files and create the nevaarize binary
in the bin/ directory.
For a clean rebuild, run: make clean && make
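On multi-core machines the build can usually be parallelized. Assuming the Makefile tolerates GNU make's -j flag (an assumption, not stated above), a job count can be derived from nproc:

```shell
# Pick a parallel job count (assumes coreutils nproc; falls back to 1 if absent)
JOBS="$(nproc 2>/dev/null || echo 1)"
echo "Suggested build command: make -j${JOBS}"
```

Run the printed command from the repository root; if parallel builds ever misbehave, fall back to a plain make.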
Verify Installation
Test that the build was successful:
./bin/nevaarize --version
You should see output like:
Nevaarize v0.2.3
Native JIT Compiler for the Nevaarize Programming Language
Built with C++23, Zero External Dependencies
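For scripted setups (e.g. CI), the banner above can be validated mechanically. The helper name below is hypothetical, not part of the Nevaarize CLI; it only checks that the first output line matches the documented "Nevaarize vX.Y.Z" shape:

```shell
# Hypothetical helper: validate that a version banner line matches "Nevaarize vX.Y.Z"
check_version_banner() {
  printf '%s\n' "$1" | grep -Eq '^Nevaarize v[0-9]+\.[0-9]+\.[0-9]+$'
}

# Example with the banner shown above:
check_version_banner "Nevaarize v0.2.3" && echo "banner looks valid"
```

In practice you would feed it the real output, e.g. check_version_banner "$(./bin/nevaarize --version | head -n1)".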
Run Your First Program
Try running the hello world example:
./bin/nevaarize examples/basics.nva
Output (excerpt):
Integer: 42
Float: 3.14
...
add(7, 3): 10
...
=== All Features Demonstrated ===
Optional: Add to PATH
For convenience, add Nevaarize to your system PATH:
# Add to ~/.bashrc or ~/.zshrc
export PATH="$PATH:/path/to/nevaarize/bin"
# Reload your shell
source ~/.bashrc
Now you can run nevaarize from anywhere:
nevaarize script.nva
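The export line above appends unconditionally, so re-sourcing your shell rc file grows PATH with duplicates. A sketch of an idempotent alternative (the function name is illustrative, not part of Nevaarize):

```shell
# Idempotent PATH helper: appends a directory only if it is not already present
add_to_path() {
  case ":${PATH}:" in
    *":$1:"*) ;;                       # already on PATH, do nothing
    *) export PATH="${PATH}:$1" ;;
  esac
}

add_to_path "/path/to/nevaarize/bin"
add_to_path "/path/to/nevaarize/bin"   # second call is a no-op
```

The case pattern wraps PATH in colons so the match works for the first, middle, and last entries alike.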
Project Structure
Here's an overview of the Nevaarize directory structure:
nevaarize/
├── bin/                # Compiled binary
├── build/              # Object files
├── core/               # Core compiler implementation
│   ├── include/        # Header files
│   ├── src/            # Source files
│   └── stdlib/         # Standard library (inside core)
├── examples/           # Example programs (64 verified)
│   ├── algorithm/      # Sorting, searching, math
│   ├── benchmarks/     # Performance benchmarks
│   ├── modules/        # Module system demos
│   ├── richcodesample/ # Advanced code samples
│   └── stdlibtest/     # Stdlib module tests
├── docs/               # Documentation (you are here)
└── Makefile            # Build configuration
Model Commands
Nevaarize includes CLI commands for AI model training and inference:
Train a Model
Run a training script and save the resulting model:
./bin/nevaarize model train script.nva to model.nmod
The output path is resolved relative to the training script's location.
Run Model Inference
Load a trained model and run inference:
# View model info
./bin/nevaarize model run model.nmod
# Run with input data
./bin/nevaarize model run model.nmod --input "[1.0, 0.0, 1.0, 0.0]"
Model training and inference features are under active development.
See examples/modelInference.nva for a verified working demo.
Behavior for complex models outside the provided examples may vary.