
A Java library that implements several soft-computing techniques and includes case studies showing how to apply them to real-world problems.


Omar-Badwilan/SoftComputingLibrary


Soft Computing Library

A Java soft-computing library that currently includes:

  • Genetic Algorithms (GA)
  • Fuzzy Logic (Mamdani + Sugeno)
  • Feed-forward Neural Networks (NN)

The repository also contains multiple case studies demonstrating how to use each module.


Quick Start

Build

mvn clean package

Run a demo

Only a few classes contain a public static void main(String[] args) entry point.

java --enable-preview -cp target/classes caseStudies.CaseStudyIRIS.IrisClassificationNN

See Case Studies / Examples for available demos and notes about datasets.


Requirements

  • JDK 24+ (the Maven compiler is configured with --enable-preview)
  • Maven

Build notes (preview features)

This project compiles with preview features enabled via Maven. If you run classes manually, you must also pass --enable-preview to java.

If you run from an IDE (IntelliJ / Eclipse), make sure:

  • Project SDK is set to a compatible JDK (24+).
  • VM options include --enable-preview for run configurations.

Modules

1. Genetic Algorithm (softcomputing.GeneticAlgorithm)

The GA module is built around a small set of core abstractions:

  • Chromosome<T>: represents a candidate solution (stores genes and a fitness).
  • FitnessFunction<T>: computes a fitness score for a chromosome.
  • SelectionStrategy<T>: chooses parents from the population.
  • Crossover<T>: combines parents to produce offspring.
  • MutationStrategy<T>: mutates offspring.
  • ReplacementStrategy<T>: decides how offspring replace the population.

Important: GeneticAlgorithm treats higher fitness as better (it selects the chromosome with the maximum fitness).
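Because the GA maximizes fitness, a minimization objective must be wrapped before use. A common transform (a standalone sketch, not part of the library; `cost` here is a hypothetical stand-in for whatever quantity you want to minimize) is:

```java
import java.util.function.DoubleUnaryOperator;

// Sketch: adapting a cost (minimization) objective to the GA's
// maximize-fitness convention.
public class FitnessTransform {
    public static void main(String[] args) {
        DoubleUnaryOperator cost = x -> (x - 3) * (x - 3); // minimized at x = 3

        // 1 / (1 + cost) maps lower cost to higher fitness, bounded in (0, 1].
        DoubleUnaryOperator fitness = x -> 1.0 / (1.0 + cost.applyAsDouble(x));

        System.out.println(fitness.applyAsDouble(3.0)); // cost 0 -> fitness 1.0
        System.out.println(fitness.applyAsDouble(5.0)); // cost 4 -> fitness 0.2
    }
}
```

The same idea applies inside a FitnessFunction<T> implementation: compute the cost from the chromosome's genes, then return the transformed value.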

  • Chromosomes
    • BinaryChromosome
    • IntegerChromosome
    • FloatChromosome
  • Selection
    • TournamentSelection
    • RouletteWheelSelection
  • Crossover
    • SinglePointCrossover
    • TwoPointCrossover
    • UniformCrossover
  • Mutation
    • FlipMutation
    • SwapIntMutation
    • UniformFPMutation
  • Replacement
    • GenerationalReplacement
    • SteadyStateReplacement
    • ElitismReplacement
    • CutoffReplacement
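As a reference for what these operators do (a plain-Java sketch, independent of the library's Crossover<T> classes), single-point crossover cuts both parents at a random index and swaps the tails:

```java
import java.util.Arrays;
import java.util.Random;

// Sketch of single-point crossover on integer gene arrays.
public class SinglePointCrossoverSketch {
    public static void main(String[] args) {
        int[] p1 = {1, 1, 1, 1, 1, 1};
        int[] p2 = {0, 0, 0, 0, 0, 0};

        Random rng = new Random(42);
        int point = 1 + rng.nextInt(p1.length - 1); // cut strictly inside the array

        int[] c1 = p1.clone();
        int[] c2 = p2.clone();
        for (int i = point; i < p1.length; i++) { // swap tails after the cut
            c1[i] = p2[i];
            c2[i] = p1[i];
        }
        System.out.println(Arrays.toString(c1)); // head from p1, tail from p2
        System.out.println(Arrays.toString(c2)); // head from p2, tail from p1
    }
}
```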

Core entry point: softcomputing.GeneticAlgorithm.algorithm.GeneticAlgorithm.

GA: Minimal custom example

This is a minimal example that builds and runs a GA directly (not via case studies). It demonstrates the library’s intended composition style.

import softcomputing.GeneticAlgorithm.algorithm.*;
import softcomputing.GeneticAlgorithm.chromosome.*;
import softcomputing.GeneticAlgorithm.crossover.*;
import softcomputing.GeneticAlgorithm.mutation.*;
import softcomputing.GeneticAlgorithm.replacement.*;
import softcomputing.GeneticAlgorithm.selection.*;

public class SimpleGA {
    public static void main(String[] args) {
        IntegerChromosome chromosome = new IntegerChromosome(10, 0, 5);

        FitnessFunction<Integer> fitness = c -> {
            double sum = 0;
            for (Integer g : c.getGenes()) sum += g;
            return sum;
        };

        GeneticAlgorithm<Integer> ga = new GeneticAlgorithm<>(
            chromosome,
            fitness,
            new TournamentSelection<>(3),
            new UniformCrossover<>(),
            new SwapIntMutation(),
            new ElitismReplacement<>(5),
            50,     // populationSize
            50,     // maxGenerations
            0.7,    // crossoverRate
            0.1     // mutationRate
        );

        Chromosome<Integer> best = ga.run();
        System.out.println("Best: " + best);
    }
}

2. Fuzzy Logic (softcomputing.FuzzyLogic)

The fuzzy module supports two inference styles:

  • Mamdani: fuzzy consequents + defuzzification.
  • Sugeno: numeric consequents (e.g., constants/linear functions), typically no defuzzifier.

Core modeling components:

  • LinguisticVariable: a variable with a numeric range (e.g., Temperature [0..100]).

  • FuzzySet: a named membership function for a linguistic variable.

  • Rule: IF antecedents THEN consequents (supports AND/OR via RuleOperator).

  • RuleBase: collection of rules.

  • FuzzyLogic: the system wrapper; call infer(...) with crisp inputs.

  • Inference

    • Mamdani: softcomputing.FuzzyLogic.inference.MamdaniEngine
    • Sugeno: softcomputing.FuzzyLogic.inference.SugenoEngine
  • Membership functions

    • TriangularMF, TrapezoidalMF, GaussianMF
  • Defuzzification (Mamdani)

    • CentroidDefuzzifier, MOMDefuzzifier, WeightedAvgDefuzzifier
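For reference, the centroid method computes the center of mass of the aggregated output membership over the variable's range. A standalone numeric sketch (not the library's CentroidDefuzzifier; the triangular set and the 0.8 firing strength are illustrative values) is:

```java
// Sketch of centroid defuzzification over a discretized domain.
public class CentroidSketch {
    // Triangular membership function with feet a, c and peak b.
    static double tri(double x, double a, double b, double c) {
        if (x <= a || x >= c) return 0;
        return x < b ? (x - a) / (b - a) : (c - x) / (c - b);
    }

    public static void main(String[] args) {
        double num = 0, den = 0;
        for (double x = 0; x <= 100; x += 0.5) {
            // Aggregated output membership: triangle on [50, 100] peaking
            // at 75, clipped at a firing strength of 0.8 (Mamdani alpha-cut).
            double mu = Math.min(0.8, tri(x, 50, 75, 100));
            num += x * mu;   // weighted sum of x
            den += mu;       // total membership mass
        }
        System.out.printf("centroid = %.1f%n", num / den); // symmetric set -> 75.0
    }
}
```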

Core entry point: softcomputing.FuzzyLogic.algorithm.FuzzyLogic.

Fuzzy (Mamdani): minimal example

import softcomputing.FuzzyLogic.MF.TriangularMF;
import softcomputing.FuzzyLogic.algorithm.FuzzyLogic;
import softcomputing.FuzzyLogic.components.*;

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class SimpleMamdani {
    public static void main(String[] args) {
        LinguisticVariable temp = new LinguisticVariable("Temp", 0, 40);
        temp.addFuzzySet(new FuzzySet("Cold", new TriangularMF(0, 0, 20)));
        temp.addFuzzySet(new FuzzySet("Hot", new TriangularMF(20, 40, 40)));

        LinguisticVariable fan = new LinguisticVariable("Fan", 0, 100);
        fan.addFuzzySet(new FuzzySet("Low", new TriangularMF(0, 0, 50)));
        fan.addFuzzySet(new FuzzySet("High", new TriangularMF(50, 100, 100)));

        RuleBase rules = new RuleBase();

        // IF Temp is Cold THEN Fan is Low
        Map<LinguisticVariable, FuzzySet> ant1 = new HashMap<>();
        ant1.put(temp, temp.getFuzzySet("Cold"));
        Rule r1 = new Rule(ant1, Collections.singletonMap(fan, fan.getFuzzySet("Low")), Rule.RuleOperator.AND);
        rules.addRule(r1);

        // IF Temp is Hot THEN Fan is High
        Map<LinguisticVariable, FuzzySet> ant2 = new HashMap<>();
        ant2.put(temp, temp.getFuzzySet("Hot"));
        Rule r2 = new Rule(ant2, Collections.singletonMap(fan, fan.getFuzzySet("High")), Rule.RuleOperator.AND);
        rules.addRule(r2);

        FuzzyLogic system = FuzzyLogic.createMamdaniSystem(rules);

        Map<LinguisticVariable, Double> inputs = Map.of(temp, 30.0);
        Map<LinguisticVariable, Double> outputs = system.infer(inputs);

        System.out.println(outputs);
    }
}

Fuzzy (Sugeno): where to look

Sugeno inference uses numeric consequents. For a complete example of setting consequents per rule, see:

  • caseStudies.DrowsinessDetectionFL_Sugeno
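Independent of the library's API, zero-order (constant-consequent) Sugeno inference reduces to a weighted average of crisp rule outputs. A minimal sketch with illustrative values:

```java
// Sketch of zero-order Sugeno inference:
// output = sum(w_i * z_i) / sum(w_i), where w_i is a rule's firing
// strength and z_i its constant consequent.
public class SugenoSketch {
    public static void main(String[] args) {
        double[] w = {0.2, 0.8};   // firing strengths of two rules
        double[] z = {10.0, 50.0}; // constant consequents (e.g., fan speed)

        double num = 0, den = 0;
        for (int i = 0; i < w.length; i++) {
            num += w[i] * z[i];
            den += w[i];
        }
        // (0.2*10 + 0.8*50) / (0.2 + 0.8)
        System.out.printf("output = %.1f%n", num / den);
    }
}
```

This weighted average is why Sugeno systems typically need no separate defuzzifier: the consequents are already crisp numbers.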

3. Neural Network (softcomputing.NeuralNetwork)

A lightweight feed-forward neural network trained with backpropagation.

Key concepts:

  • NNConfig: builder-based configuration for layer sizes, activations, learning rate, etc.

  • NeuralNetwork: provides train(...) and predict(...).

  • Dataset: stores inputs and targets as double[][].

  • Network: softcomputing.NeuralNetwork.core.NeuralNetwork (feed-forward)

  • Configuration: softcomputing.NeuralNetwork.config.NNConfig

  • Activations: Sigmoid, ReLU, Tanh, Linear

  • Loss: MeanSquaredError, CrossEntropy

  • Initialization: XavierInitializer (and others under initialization/)

  • Data: softcomputing.NeuralNetwork.data.Dataset
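For reference, Xavier (Glorot) initialization scales initial weights by the layer's fan-in and fan-out to keep activations from exploding or vanishing. A plain-Java sketch of the uniform variant (not the library's XavierInitializer) is:

```java
import java.util.Random;

// Sketch of uniform Xavier/Glorot initialization: weights drawn from
// U(-limit, limit) with limit = sqrt(6 / (fanIn + fanOut)).
public class XavierSketch {
    public static void main(String[] args) {
        int fanIn = 2, fanOut = 4;
        double limit = Math.sqrt(6.0 / (fanIn + fanOut));

        Random rng = new Random(42); // fixed seed for reproducibility
        double[][] w = new double[fanOut][fanIn];
        for (int i = 0; i < fanOut; i++)
            for (int j = 0; j < fanIn; j++)
                w[i][j] = (2 * rng.nextDouble() - 1) * limit;

        System.out.println("limit = " + limit); // sqrt(6/6) = 1.0 here
    }
}
```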

NN: minimal example

import softcomputing.NeuralNetwork.activation.ReLU;
import softcomputing.NeuralNetwork.activation.Sigmoid;
import softcomputing.NeuralNetwork.config.NNConfig;
import softcomputing.NeuralNetwork.core.NeuralNetwork;
import softcomputing.NeuralNetwork.data.Dataset;
import softcomputing.NeuralNetwork.initialization.XavierInitializer;
import softcomputing.NeuralNetwork.loss.MeanSquaredError;

public class SimpleNN {
    public static void main(String[] args) {
        double[][] X = { {0, 0}, {0, 1}, {1, 0}, {1, 1} };
        double[][] y = { {0}, {1}, {1}, {0} };
        Dataset dataset = new Dataset(X, y);

        NNConfig config = new NNConfig.Builder()
            .layerSizes(2, 4, 1)
            .activations(new ReLU(), new Sigmoid())
            .learningRate(0.05)
            .epochs(200)
            .batchSize(1)
            .weightInitializer(new XavierInitializer(42))
            .lossFunction(new MeanSquaredError())
            .build();

        NeuralNetwork nn = new NeuralNetwork(config);
        nn.train(dataset, config.getEpochs(), config.getLearningRate(), config.getBatchSize());

        double[] pred = nn.predict(new double[] { 1, 0 });
        System.out.println("Pred(1,0) = " + pred[0]);
    }
}

Case Studies / Examples

All examples are under src/main/java/caseStudies/.

  • GA Job Scheduling: caseStudies.JobSchedulingGA (call run())
  • Fuzzy Drowsiness Detection (Mamdani): caseStudies.DrowsinessDetectionFL_Mamdani (call run(...))
  • Fuzzy Drowsiness Detection (Sugeno): caseStudies.DrowsinessDetectionFL_Sugeno (call run(...))
  • NN Iris Classification: caseStudies.CaseStudyIRIS.IrisClassificationNN
    • Dataset included at src/main/java/caseStudies/CaseStudyIRIS/Iris.csv
  • NN Hotel Booking Prediction: caseStudies.CaseStudyNN.CaseStudyNN
    • Note: DATASET_PATH is currently an absolute path inside the class; update it to point to your local results.csv.

Which case studies are directly runnable?

Classes that contain public static void main(String[] args):

  • caseStudies.CaseStudyIRIS.IrisClassificationNN
  • caseStudies.CaseStudyNN.CaseStudyNN
  • caseStudies.caseStudyOLD.CaseStudyTEST

Other case-study classes (e.g., JobSchedulingGA, DrowsinessDetectionFL_*) are designed to be invoked from another main method by calling their run(...) methods.


Build

mvn clean package

Output goes to target/.


Use as a dependency (local)

If you want to use this library from another Maven project, first install it to your local Maven repository:

mvn clean install

Then add it in the other project’s pom.xml:

<dependency>
  <groupId>softcomputing</groupId>
  <artifactId>SoftComputingGA</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>

Run (command line)

This project does not include an exec-maven-plugin configuration, so the most reliable way from the terminal is:

  1. Build:
     mvn -DskipTests package
  2. Run a class with a main method from target/classes:
     java --enable-preview -cp target/classes caseStudies.CaseStudyIRIS.IrisClassificationNN

Other runnable entry points:

  • caseStudies.CaseStudyNN.CaseStudyNN
  • caseStudies.caseStudyOLD.CaseStudyTEST

Minimal usage snippets

GA: Job Scheduling

import caseStudies.JobSchedulingGA;
import java.util.List;

List<Integer> jobsTime = List.of(3, 7, 5, 2, 8, 4);
List<Integer> machinesLimit = List.of(12, 10, 10);

JobSchedulingGA jobScheduling = new JobSchedulingGA();
jobScheduling.setNumMachines(machinesLimit.size());
jobScheduling.setMachineLimit(machinesLimit);
jobScheduling.setNumJobs(jobsTime.size());
jobScheduling.setJobTime(jobsTime);
jobScheduling.run();

Fuzzy Logic: Drowsiness Detection

import caseStudies.DrowsinessDetectionFL_Mamdani;

new DrowsinessDetectionFL_Mamdani().run(
    18.0,  // BlinkRate
    12.0,  // HeadTilt
    0.45   // SPV
);

NN: Iris Classification

mvn -DskipTests package
java --enable-preview -cp target/classes caseStudies.CaseStudyIRIS.IrisClassificationNN

Project Structure

src/main/java/
├── Main.java
├── caseStudies/
│   ├── JobSchedulingGA.java
│   ├── DrowsinessDetectionFL_Mamdani.java
│   ├── DrowsinessDetectionFL_Sugeno.java
│   ├── CaseStudyIRIS/
│   │   ├── Iris.csv
│   │   └── IrisClassificationNN.java
│   ├── CaseStudyNN/
│   │   └── CaseStudyNN.java
│   └── caseStudyOLD/
└── softcomputing/
    ├── GeneticAlgorithm/
    ├── FuzzyLogic/
    └── NeuralNetwork/
