neet alternatives and similar packages
Based on the "AI" category.
- tensor-safe: A Haskell framework to define valid deep learning models and export them to other frameworks like TensorFlow JS or Keras.
- moo: Genetic algorithm library for Haskell. Binary and continuous (real-coded) GAs. Binary GAs: binary and Gray encoding; point mutation; one-point, two-point, and uniform crossover. Continuous GAs: Gaussian mutation; BLX-α, UNDX, and SBX crossover. Selection operators: roulette, tournament, and stochastic universal sampling (SUS), with optional niching, ranking, and scaling. Replacement strategies: generational with elitism and steady state. Constrained optimization: random constrained initialization, death penalty, constrained selection without a penalty function. Multi-objective optimization: NSGA-II and constrained NSGA-II.
- simple-genetic-algorithm: Simple parallel genetic algorithm implementation in pure Haskell.
- cv-combinators: Functional combinators for computer vision, currently using OpenCV as a backend.
- simple-neural-networks: Simple parallel neural networks implementation in pure Haskell.
- HaVSA: HaVSA (Have-Saa) is a Haskell implementation of the Version Space Algebra machine learning technique described by Tessa Lau.
- Etage: A general data-flow framework featuring nondeterminism, laziness, and neurological pseudo-terminology.
- CarneadesDSL: An implementation and DSL for the Carneades argumentation model.
- simple-genetic-algorithm-mr: Fork of simple-genetic-algorithm using MonadRandom.
- attoparsec-arff: An attoparsec-based parser for ARFF files in Haskell.
README
NEET
NEET is a Haskell library for evolving NEAT neural networks. I wrote it because I saw MarI/O (https://www.youtube.com/watch?v=qv6UVOQ0F44), a neat application of NEAT to playing Super Mario World. I plan on using this package to mess around with AI.
Features
- Lots of parameters
- Training networks
- Using networks
- Parallel fitness evaluation
- Phased Search
- Training in an arbitrary monad
Planned Features
- Serialization
- Better rendering
- (Maybe) Parallel evaluation of breeding functions
Lofty Dreams
- CPPN and HyperNEAT support
What is NEAT?
NEAT, or NeuroEvolution of Augmenting Topologies, is a genetic algorithm for evolving neural networks. When NEAT was developed, its novel use of historical markers (innovation IDs) on the genes encoding neural connections allowed it to "cross over" those genes (as in biological meiosis) while sidestepping a long-standing problem: structurally different networks can use similar connections for entirely different purposes. Instead of trying to analyze the networks' shapes, the algorithm simply matches up genes that share an innovation ID and crosses those over.
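The matching step can be sketched in a few lines of Haskell. This is an illustrative toy, not this library's API: genes are keyed by innovation ID, matched genes average their weights (real NEAT typically picks one parent's weight at random), and unmatched (disjoint/excess) genes are inherited from the fitter parent, one common convention.

```haskell
import qualified Data.Map.Strict as Map
import Data.Map.Strict (Map)

-- A connection gene, keyed by its historical marker (innovation ID):
-- (source node, target node, weight).
type Genes = Map Int (Int, Int, Double)

-- Crossover by historical marker: genes sharing an innovation ID are
-- matched directly, with no structural analysis of either network.
crossover :: Genes -> Genes -> Genes
crossover fitter other = Map.union matched fitterOnly
  where
    -- Matched genes: average the parents' weights (a deterministic
    -- simplification of NEAT's random choice).
    matched = Map.intersectionWith
      (\(s, t, w1) (_, _, w2) -> (s, t, (w1 + w2) / 2)) fitter other
    -- Disjoint/excess genes: taken from the fitter parent.
    fitterOnly = Map.difference fitter other

-- Example parents: gene 1 is shared; genes 2 and 4 exist only in
-- parentA (assumed fitter); gene 3 exists only in parentB.
parentA, parentB :: Genes
parentA = Map.fromList [(1, (0, 2, 1.0)), (2, (1, 2, 0.5)), (4, (0, 3, 0.2))]
parentB = Map.fromList [(1, (0, 2, 3.0)), (3, (1, 3, 0.7))]

child :: Genes
child = crossover parentA parentB
```

Here `child` keeps genes 1, 2, and 4, with gene 1's weight averaged to 2.0.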
Historical markers also make it easier to group networks into species, since the markers form a genetic record that can be used to measure how related two genomes are. Speciation is beneficial because it makes it harder for the population to converge prematurely on a suboptimal solution: organisms compete primarily within their own species, which maintains genetic diversity and in turn allows several approaches to a problem to develop side by side.
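A relatedness measure in this spirit can be sketched as a compatibility distance: count how many genes appear in only one genome and average the weight difference of matched genes. The coefficients `c1` and `c2` below are hypothetical example values, not anything defined by this library.

```haskell
import qualified Data.Map.Strict as Map
import Data.Map.Strict (Map)

-- Connection genes keyed by innovation ID: (source, target, weight).
type Genes = Map Int (Int, Int, Double)

-- NEAT-style compatibility distance (sketch): unmatched genes plus the
-- average weight difference of matched genes, weighted by c1 and c2.
-- A genome pair is put in the same species when this falls below some
-- threshold.
compatibility :: Genes -> Genes -> Double
compatibility a b = c1 * fromIntegral unmatched / n + c2 * avgWeightDiff
  where
    c1 = 1.0  -- hypothetical coefficient for disjoint/excess genes
    c2 = 0.4  -- hypothetical coefficient for weight differences
    unmatched = Map.size (Map.difference a b) + Map.size (Map.difference b a)
    n = fromIntegral (max 1 (max (Map.size a) (Map.size b)))
    shared = Map.intersectionWith (\(_, _, w1) (_, _, w2) -> abs (w1 - w2)) a b
    avgWeightDiff
      | Map.null shared = 0
      | otherwise = sum (Map.elems shared) / fromIntegral (Map.size shared)

genomeA, genomeB :: Genes
genomeA = Map.fromList [(1, (0, 2, 1.0)), (2, (1, 2, 0.5)), (4, (0, 3, 0.2))]
genomeB = Map.fromList [(1, (0, 2, 3.0)), (3, (1, 3, 0.7))]
```

For `genomeA` and `genomeB` there are three unmatched genes out of a larger genome size of three, and one matched gene with weight difference 2.0, giving a distance of 1.0 + 0.4 × 2.0 = 1.8.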
The diversity that speciation protects also lets NEAT start from a minimal network (usually just the inputs fully connected to the outputs) and build complexity up over time. At the time NEAT was published, earlier genetic neuroevolution algorithms instead started networks with extra neurons and connections to generate an initial pool of diversity. That can slow down learning, since more weights need to be tuned, and the extra complexity may not be necessary for a solution anyway.
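Such a minimal starting genome is easy to picture: every input node connected directly to every output node, nothing more. The sketch below is illustrative only; the fixed 0.0 weight is a placeholder where a real implementation would use random initial weights.

```haskell
import qualified Data.Map.Strict as Map
import Data.Map.Strict (Map)

-- Connection genes keyed by innovation ID: (source, target, weight).
type Genes = Map Int (Int, Int, Double)

-- Minimal starting genome: inputs are nodes [0 .. nIn-1], outputs are
-- nodes [nIn .. nIn+nOut-1], and every input is connected to every
-- output with sequential innovation IDs starting at 1.
minimalGenome :: Int -> Int -> Genes
minimalGenome nIn nOut = Map.fromList
  [ (innov, (i, nIn + o, 0.0))
  | (innov, (i, o)) <-
      zip [1 ..] [ (i, o) | i <- [0 .. nIn - 1], o <- [0 .. nOut - 1] ]
  ]
```

With 3 inputs and 2 outputs this yields 6 connection genes; mutation operators then add nodes and connections only as they prove useful.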