Angel ML: A High-Performance and Full-Stack Distributed Machine Learning Platform

Angel, an LF AI Foundation graduated project, is a high-performance distributed machine learning platform for enterprise applications, originally developed and open-sourced by Tencent.

It provides full-stack facilities for feature engineering, model building, parameter tuning, model serving, and AutoML, and includes algorithms for statistical learning, neural networks, and graph computing. Angel's mass adoption within Tencent demonstrates its strength in training super-high-dimensional models, even with billions of features. Angel offers several deployment options, such as Docker, Yarn, and Kubernetes, making it a reliable and efficient solution for enterprise machine learning applications. We encourage you to download Angel and read the latest paper, which covers the latest major release, 3.0.

Efficient

High-performance distributed machine learning platform with full-stack facilities

Reliable

Advantages in training super-high-dimensional models, even with billions of features

Scalable

Offers several deployment options, such as Docker, Yarn, and Kubernetes

Open Source

Angel was open-sourced on GitHub in 2017, and the community has grown rapidly since, reaching 5.8k stars, 1.4k forks, and 7 sub-projects within just 3 years. Angel joined the LF AI Foundation in August 2018 and graduated in December 2019 as the foundation's first top-level project from China. Please visit us on GitHub, where our development happens.

We also invite you to join our community both as a user of Angel and also as a contributor to its development. We look forward to your contributions!

Getting Started

Execution Environment Requirements:

  1. Java version 1.8
  2. The Angel distribution package angel-<version>-bin.zip, which can be downloaded from the release page. Unpacking it generates four subdirectories under the root directory:
    1. bin: contains Angel submit scripts,
    2. conf: contains system config files,
    3. data: contains data for testing, and
    4. lib: contains jars for Angel and its dependencies.
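As a quick sanity check for requirement 1, a small snippet (illustrative only, not part of Angel) prints the running JVM's version so you can confirm it satisfies the Java 1.8 requirement:

```java
public class CheckJavaVersion {
    public static void main(String[] args) {
        // JVMs before Java 9 report versions as "1.8.0_xxx"; 9+ as "9", "11", ...
        String version = System.getProperty("java.version");
        System.out.println("Running on Java " + version);
    }
}
```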

Running on local:

Once the distribution package is unzipped, find the bin directory under the root; all the submit scripts are located there. A simple logistic regression example can be run with:

./angel-example com.tencent.angel.example.ml.LogisticRegLocalExample

The result is saved in /tmp/model.
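For intuition, the example class trains a logistic regression model locally. The core algorithm can be sketched generically as plain batch gradient descent on a tiny synthetic dataset; this is not Angel's implementation, and none of the class or method names below come from the Angel API:

```java
// Minimal logistic regression sketch: batch gradient descent on synthetic
// data where the label is 1 exactly when x > 0. Illustrative only.
public class LogisticRegressionSketch {
    static double sigmoid(double z) { return 1.0 / (1.0 + Math.exp(-z)); }

    // Train by batch gradient descent; returns the parameters {w, b}.
    static double[] fit(double[] x, int[] y, int epochs, double lr) {
        double w = 0.0, b = 0.0;
        for (int e = 0; e < epochs; e++) {
            double gw = 0.0, gb = 0.0;
            for (int i = 0; i < x.length; i++) {
                double err = sigmoid(w * x[i] + b) - y[i]; // prediction error
                gw += err * x[i];
                gb += err;
            }
            w -= lr * gw / x.length; // average gradient step
            b -= lr * gb / x.length;
        }
        return new double[]{w, b};
    }

    static int predict(double[] wb, double xi) {
        return sigmoid(wb[0] * xi + wb[1]) >= 0.5 ? 1 : 0;
    }

    public static void main(String[] args) {
        double[] x = {-2.0, -1.0, -0.5, 0.5, 1.0, 2.0};
        int[]    y = { 0,    0,    0,   1,   1,   1  };
        double[] wb = fit(x, y, 1000, 0.5);
        for (int i = 0; i < x.length; i++) {
            System.out.println("x=" + x[i] + " -> " + predict(wb, x[i])
                    + " (label " + y[i] + ")");
        }
    }
}
```

Angel distributes this kind of training across a parameter-server architecture; the sketch above only shows the single-machine math the local example exercises.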

Contribute

Angel maintains three mailing lists. You are invited to join the one that best meets your interest.

Users

Many other organizations are using Angel. To add yours, please contact us.
