Artificial Intelligence, Machine Learning, Neural Networks, NLP, Speech Recognition and Voice tools, and more.


MACHINE LEARNING

Machine learning and statistics are closely related fields, so do check out the Statistics page for more packages.

  • BackpropNeuralNet.jl :: A neural network in Julia.
  • BNMF.jl :: Gamma Process Non-negative Matrix Factorization (GaP-NMF).
  • ConfidenceWeighted.jl :: Confidence weighted, a machine learning algorithm.
  • Contingency.jl :: Assorted techniques for the purpose of enabling automated machine learning.
  • Clustering.jl :: Basic functions for clustering data: k-means, dp-means, etc. (a minimal usage sketch follows this list).
  • DAI.jl :: A julia binding to the C++ discrete approximate inference library for graphical models: libDAI.
  • DecisionTree.jl.
  • DecisionTrees.jl.
  • DeepQLearning.jl :: An implementation of DeepMind's Deep Q Learning algorithm described in Playing Atari with Deep Reinforcement Learning.
  • Discretizers.jl :: A package to support discretization methods and mapping functions for data discretization and label maps.
  • EGR.jl :: The Stochastic Gradient (SG) algorithm for machine learning.
  • ELM.jl :: Extreme Learning Machines are a variant of Single-Hidden Layer Feedforward Networks (SLFNs) with one significant departure: their weights are not iteratively tuned, which greatly speeds up training.
  • FeatureSelection.jl :: Common measures and algorithms for feature selection.
  • Flimsy.jl :: Gradient based Machine Learning for Julia.
  • Flux.jl :: A library for machine learning implemented in Julia. (Alpha stage)
  • FunctionalDataUtils.jl :: Utility functions for the FunctionalData package, mainly from the area of computer vision / machine learning.
  • go.jl :: A deep learning based Go bot implemented in Julia.
  • GradientBoost.jl :: Gradient boosting framework for Julia.
  • GURLS.jl :: A pure Julia port of the GURLS supervised learning library.
  • Glmnet.jl :: Julia wrapper for fitting Lasso/ElasticNet GLM models using glmnet.
  • HopfieldNets.jl :: Discrete and continuous Hopfield networks in Julia.
  • HSIC.jl :: Julia implementations of the Hilbert-Schmidt Independence Criterion (HSIC).
  • KaggleDigitRecognizer.jl :: Julia code for Kaggle's Digit Recognizer competition.
  • KDTrees.jl :: KD Trees.
  • Kernels.jl :: A Julia package for Mercer kernels and Gramian matrix calculation/approximation functions used in kernel methods of machine learning.
  • Knet.jl :: A machine learning module implemented in Julia.
  • kNN.jl :: The k-Nearest Neighbors algorithm in Julia.
  • KSVM.jl by @remusao :: Kernel Support Vector Machine (SVM) written in Julia.
  • KSVM.jl by @Evizero :: Support Vector Machines in pure Julia.
  • Ladder.jl :: A reliable leaderboard algorithm for machine learning competitions.
  • Learn.jl :: Base framework library for machine learning packages.
  • LearnBase.jl :: Abstractions for Julia Machine Learning Packages.
  • liblinear.jl :: Liblinear binding to Julia.
  • LIBSVM.jl :: Julia bindings for LIBSVM.
  • NMF.jl :: A Julia package for non-negative matrix factorization (NMF).
  • MachineLearning.jl :: A machine learning library that consolidates common machine learning algorithms written in pure Julia and presents a consistent API.
  • Merlin.jl :: Flexible Deep Learning Framework in Julia.
  • Milk.jl :: Machine Learning Kit.
  • MLKernels.jl :: Mercer kernels and Gramian matrix calculation/approximation.
  • Mocha.jl :: A Deep Learning framework for Julia, inspired by the C++ Deep Learning framework Caffe.
  • MochaTheano.jl :: Allow use of Theano for automatic differentiation within Mocha, via PyCall.
  • MXNet.jl :: Flexible and efficient deep learning in Julia.
  • Ollam.jl :: OLLAM = Online Learning of Linear Adaptable Models.
  • OnlineAI.jl :: Machine learning for sequential/streaming data. {Usable: 3, Robust: 3, Active: 3}
  • Orchestra.jl :: Heterogeneous ensemble learning package for the Julia programming language.
  • POMDPs.jl :: A Julia framework for solving Markov decision processes and reinforcement learning.
  • PrivateMultiplicativeWeights.jl :: Differentially private synthetic data.
  • ProjectiveDictionaryPairLearning.jl :: Julia code for the paper S. Gu, L. Zhang, W. Zuo, and X. Feng, “Projective Dictionary Pair Learning for Pattern Classification,” In NIPS 2014.
  • RegERMs.jl :: A package implementing several machine learning algorithms in a regularised empirical risk minimisation framework (SVMs, LogReg, Linear Regression) in Julia.
  • SALSA.jl :: Software Lab for Advanced Machine Learning and Stochastic Algorithms (SALSA), a native Julia implementation of well-known stochastic algorithms for linear and non-linear Support Vector Machines.
  • ScikitLearn.jl :: Julia implementation of the scikit-learn API.
  • ScikitLearnBase.jl :: Definition of the ScikitLearn.jl API.
  • SimpleML.jl :: Textbook implementations of some Machine Learning Algorithms in Julia.
  • SFA.jl :: Implementation of the standard SFA (Slow Feature Analysis) algorithm (both linear and non-linear signal expansion) in Julia.
  • SoftConfidenceWeighted.jl :: Exact Soft Confidence-Weighted Learning.
  • Strada.jl :: A deep learning library for Julia based on Caffe.
  • SupervisedLearning.jl :: Front-end interface for supervised machine learning.
  • SVMLightLoader.jl :: Loader of svmlight / liblinear format files.
  • TensorFlow.jl :: A Julia wrapper for TensorFlow, the open source machine learning framework from Google.
  • TSVD.jl :: Truncated singular value decomposition with partial reorthogonalization.
  • YCaret.jl :: Machine learning utility functions in Julia.
  • ValueHistories.jl :: Utilities to efficiently track learning curves or other optimization information.
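
Many of these packages share a simple "matrix in, model out" workflow. As a small illustration, here is a minimal k-means clustering sketch using Clustering.jl on synthetic data; keyword options and return-type details may vary between package versions, so treat it as a starting point rather than a reference.

```julia
using Clustering

# Synthetic 2-D data: observations are stored column-wise (d x n), as Clustering.jl expects.
X = hcat(randn(2, 100) .- 2.0, randn(2, 100) .+ 2.0)

# Partition the 200 points into k = 2 clusters with k-means.
result = kmeans(X, 2)

println(result.centers)        # 2x2 matrix, one column per cluster center
println(assignments(result))   # cluster index (1 or 2) for each point
println(counts(result))        # number of points assigned to each cluster
```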
Resources

HMM


NEURAL NETWORKS

  • ANN.jl :: Artificial Neural Networks.
  • Boltzmann.jl :: Restricted Boltzmann Machines and Deep Belief Networks in Julia
  • FANN.jl :: A Julia wrapper for the Fast Artificial Neural Network Library (FANN).
  • hinton.jl :: Create hinton diagrams in Julia. Hinton diagrams are used to visualize weight matrices in neural networks.
  • Julia_Neural_Network :: Basic Neural Network written in JuliaLang.
  • KUnet.jl :: Neural network code based on Julia and CUDA.
  • mlpnnets.jl :: Feed-forward MLP neural network implementation.
  • MultiLabelNeuralNetwork.jl :: A simple feed-forward neural network for multi-label classification.
  • neural.jl :: A Julia implementation of a neural network, based on Sergio Fierens' Ruby version.
  • NeuralNets.jl :: Generic artificial neural networks in Julia.
  • neuralnetwork.jl :: An implementation of the neural network originally written for MATLAB/Octave by Andrew Ng for the Coursera Machine Learning class.
  • NeuralNetworks.jl :: Various functions for Neural Networks implemented in Julia.
  • RecurrentNN.jl :: Deep RNN and LSTM in Julia.
  • RNN.jl :: Recurrent Neural Networks.
  • SimpleNets :: Simple neural net implementations in Julia (a minimal forward-pass sketch follows this list).
  • SpikeNet.jl :: A spiking neural network simulator written in Julia.
  • StackedNets.jl :: A simple interface to deep stacks of neural network units that can be trained using gradient descent over defined error measures.
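
Several of the entries above are small, from-scratch feed-forward networks. To make the basic structure concrete, the sketch below implements a single forward pass in plain Julia, with one sigmoid hidden layer and a linear output; it assumes no package API and leaves training out entirely.

```julia
# Minimal feed-forward pass in plain Julia (illustrative only; the packages
# above each provide their own layer types and training routines).
sigmoid(z) = 1 ./ (1 .+ exp.(-z))

W1, b1 = randn(5, 3), zeros(5)   # input layer: 3 features -> 5 hidden units
W2, b2 = randn(2, 5), zeros(2)   # output layer: 5 hidden units -> 2 outputs

forward(x) = W2 * sigmoid(W1 * x .+ b1) .+ b2

x = rand(3)          # a single 3-dimensional input
println(forward(x))  # 2-dimensional output scores
```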
Resources

NLP

Japanese

  • MeCab.jl :: Julia binding for the Japanese morphological analyzer MeCab.
Resources
  • Text-Benchmarks :: Comparing Python v. Clojure v. Julia performance in text-processing and dynamic collections.

SPEECH RECOGNITION

  • MelGeneralizedCepstrums.jl :: Provides mel-generalized cepstrum analysis for spectrum envelope estimation in Julia, including linear prediction, mel-cepstrum, generalized cepstrum and mel-generalized cepstrum analysis.
  • SpeechBase.jl.
  • SPTK.jl :: A Julia wrapper for the Speech Signal Processing Toolkit (SPTK), based on the modified version of SPTK.
  • SynthesisFilters.jl :: Speech Synthesis Filters.
  • WORLD.jl :: A Julia wrapper for WORLD, a high-quality speech analysis, modification and synthesis system. WORLD decomposes a speech signal into the fundamental frequency (F0), the spectral envelope and an excitation signal (or aperiodicity, as used in TANDEM-STRAIGHT), and can re-synthesize a speech signal from these parameters (a hedged usage sketch follows). See here for the original WORLD.
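
As a rough sketch of that analysis/re-synthesis round trip, the fragment below follows the naming of the underlying WORLD C library (dio for F0, cheaptrick for the spectral envelope, d4c for aperiodicity, synthesis for reconstruction). The exact WORLD.jl signatures, option types and argument order are assumptions here; check the package README for your installed version.

```julia
# Hedged sketch: call names mirror the WORLD C API as wrapped by WORLD.jl;
# exact signatures and option structs are assumptions, not a reference.
using WORLD

fs = 16000                     # sampling rate [Hz]
x  = randn(fs)                 # stand-in for a 1-second speech waveform
period = 5.0                   # frame period [ms]

f0, timeaxis = dio(x, fs)                  # F0 contour (assumed default options)
f0 = stonemask(x, fs, timeaxis, f0)        # F0 refinement
sp = cheaptrick(x, fs, timeaxis, f0)       # spectral envelope
ap = d4c(x, fs, timeaxis, f0)              # aperiodicity

y = synthesis(f0, sp, ap, period, fs, length(x))  # re-synthesized waveform
```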